DUAL CMOS ARRAY IMAGING
Patent abstract:
The present invention relates to an image capture system that includes a plurality of light sources, each configured to emit light that has a specified central wavelength; a first light sensing element that has a first field of view and is configured to receive reflected illumination from a first portion of a surgical site; a second light sensing element that has a second field of view and is configured to receive reflected illumination from a second portion of the surgical site; and a computing system. The computing system is configured to receive data from the first light sensing element and the second light sensing element, compute imaging data based on the data received from the first and second light sensing elements, and transmit the imaging data for reception by a display system. The second field of view can overlap at least a portion of the first field of view. An image capture system control system can operate in a similar way.
Publication number: BR112020012993A2
Application number: R112020012993-3
Filing date: 2018-07-30
Publication date: 2020-12-01
Inventors: Frederick E. Shelton IV; Jason L. Harris; David C. Yates; Jerome R. Morgan
Applicant: Ethicon LLC
IPC main class:
Patent description:
[001] This application claims priority benefit under 35 U.S.C. § 119(e) to US provisional patent application serial number 62/649,291, entitled USE OF LASER LIGHT AND RED-GREEN-BLUE COLORATION TO DETERMINE PROPERTIES OF BACK SCATTERED LIGHT, filed on March 28, 2018.
[002] The present application also claims priority under 35 U.S.C. § 119(e) to US provisional patent application serial number 62/611,341, entitled INTERACTIVE SURGICAL PLATFORM, filed on December 28, 2017; US provisional patent application serial number 62/611,340, entitled CLOUD-BASED MEDICAL ANALYTICS, filed on December 28, 2017; and US provisional patent application serial number 62/611,339, entitled ROBOT ASSISTED SURGICAL PLATFORM, filed on December 28, 2017, the disclosure of each of which is incorporated herein by reference in its entirety.
BACKGROUND OF THE INVENTION
[003] The present invention relates to various surgical systems. Surgical procedures are typically performed in operating rooms at a health care facility, such as a hospital. A sterile field is typically created around the patient. The sterile field may include properly attired members of the scrub team, as well as all furniture and fixtures in the area. Various surgical devices and systems are used to perform a surgical procedure.
SUMMARY OF THE INVENTION
[004] In some aspects, a minimally invasive image capture system can comprise a plurality of light sources, each light source being configured to emit light with a specified central wavelength; a first light sensing element that has a first field of view and is configured to receive reflected illumination from a first portion of a surgical site when the first portion of the surgical site is illuminated by at least one of the plurality of light sources; a second light sensing element that has a second field of view and is configured to receive reflected illumination from a second portion of the surgical site when the second portion of the surgical site is illuminated by at least one of the plurality of light sources; and a computing system. The computing system is configured to receive data from the first light sensing element, receive data from the second light sensing element, compute imaging data based on the data received from the first light sensing element and the data received from the second light sensing element, and transmit the imaging data for reception by a display system. Also, the second field of view can overlap at least a portion of the first field of view.
[005] In one aspect of the minimally invasive image capture system, the first field of view has a first angle and the second field of view has a second angle, and the first angle is equal to the second angle.
[006] In one aspect of the minimally invasive image capture system, the first field of view has a first angle and the second field of view has a second angle, and the first angle is different from the second angle.
[007] In one aspect of the minimally invasive image capture system, the first light sensing element has an optical component configured to adjust the first field of view.
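The overlap relationships described in the summary (partial overlap of the two fields of view, and the fully surrounded case discussed in the aspects below) can be made concrete with a small geometric sketch. This sketch is not part of the patent: the `FieldOfView` model, the circular-footprint assumption, and all names are hypothetical, chosen only to illustrate the overlap cases.

```python
import math
from dataclasses import dataclass


@dataclass
class FieldOfView:
    """A sensor's field of view, modeled (as an assumption) as a circular
    footprint projected onto the tissue plane."""
    center_x: float  # footprint center, arbitrary units (e.g., mm)
    center_y: float
    radius: float    # footprint radius


def overlap_kind(first: FieldOfView, second: FieldOfView) -> str:
    """Classify how the second field of view relates to the first."""
    d = math.hypot(second.center_x - first.center_x,
                   second.center_y - first.center_y)
    # Second footprint fully contains the first (cf. the "completely
    # surrounded" aspect of the summary).
    if d + first.radius <= second.radius:
        return "first fully surrounded by second"
    # Footprints intersect without containment (cf. partial overlap).
    if d < first.radius + second.radius:
        return "partial overlap"
    return "no overlap"
```

For example, `overlap_kind(FieldOfView(0, 0, 5), FieldOfView(1, 0, 10))` reports the fully surrounded case, while two equal-radius footprints offset by less than their combined radii report a partial overlap.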
[008] In one aspect of the minimally invasive image capture system, the second light sensing element has an optical component configured to adjust the second field of view.
[009] In one aspect of the minimally invasive image capture system, the second field of view overlaps the entire first field of view.
[0010] In one aspect of the minimally invasive image capture system, the first field of view is completely surrounded by the second field of view.
[0011] In one aspect of the minimally invasive image capture system, the first light sensing element and the second light sensing element are at least partially disposed within an elongated camera probe.
[0012] In one aspect of the minimally invasive image capture system, each of the plurality of light sources is configured to emit light that has a specified central wavelength within a visible spectrum.
[0013] In one aspect of the minimally invasive image capture system, at least one of the plurality of light sources is configured to emit light that has a specified central wavelength outside a visible spectrum.
[0014] In one aspect of the minimally invasive image capture system, the specified central wavelength outside the visible spectrum is within an ultraviolet range.
[0015] In one aspect of the minimally invasive image capture system, the specified central wavelength outside the visible spectrum is within an infrared range.
[0016] In one aspect of the minimally invasive image capture system, the computing system configured to compute the imaging data based on the data received from the first light sensing element and the data received from the second light sensing element comprises a computing system configured to perform a first data analysis on the data received from the first light sensing element and a second data analysis on the data received from the second light sensing element.
[0017] In one aspect of the minimally invasive image capture system, the first data analysis differs from the second data analysis.
[0018] In some aspects, a minimally invasive image capture system can comprise a processor and a memory coupled to the processor. The memory can store instructions executable by the processor to control an operation of a plurality of light sources directed at a tissue sample, each light source being configured to emit light with a specified central wavelength; to receive, from a first light sensing element, first data related to the reflected illumination of a first portion of the surgical site when the first portion of the surgical site is illuminated by at least one of the plurality of light sources; to receive, from a second light sensing element, second data related to the reflected illumination of a second portion of the surgical site when the second portion of the surgical site is illuminated by at least one of the plurality of light sources; to compute imaging data based on the first data received from the first light sensing element and on the second data received from the second light sensing element; and to transmit the imaging data for reception by a display system. In some aspects, the second field of view overlaps at least a portion of the first field of view.
[0019] In one aspect of the minimally invasive image capture system, the memory coupled to the processor further stores instructions executable by the processor to receive, from a surgical instrument, operational data related to a function or to a state of the surgical instrument.
[0020] In one aspect of the minimally invasive image capture system, the memory coupled to the processor further stores instructions executable by the processor to compute imaging data based on the first data received from the first light sensing element, on the second data received from the second light sensing element, and on the operational data related to the function or the state of the surgical instrument.
[0021] In some aspects, a minimally invasive image capture system may include a control circuit configured to control an operation of a plurality of light sources directed at a tissue sample, each light source being configured to emit light with a specified central wavelength; to receive, from a first light sensing element, first data related to the reflected illumination of a first portion of the surgical site when the first portion of the surgical site is illuminated by at least one of the plurality of light sources; to receive, from a second light sensing element, second data related to the reflected illumination of a second portion of the surgical site when the second portion of the surgical site is illuminated by at least one of the plurality of light sources; to compute imaging data based on the first data received from the first light sensing element and on the second data received from the second light sensing element; and to transmit the imaging data for reception by a display system. In some aspects, the second field of view overlaps at least a portion of the first field of view.
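The control-circuit behavior just described (illuminate, read both light sensing elements, compute imaging data, transmit for display) can be sketched as a simple loop. This is purely illustrative and not the patent's method: the driver callables, the per-source sequencing, and the pairwise-average "fusion" are assumptions, since the description does not specify the imaging computation.

```python
from typing import Callable, List, Sequence


def capture_cycle(
    set_light_source: Callable[[int, bool], None],   # hypothetical driver: enable/disable source i
    read_first_sensor: Callable[[], Sequence[float]],
    read_second_sensor: Callable[[], Sequence[float]],
    transmit: Callable[[List[float]], None],         # hands imaging data to a display system
    num_sources: int,
) -> None:
    """One illustrative pass over all light sources: illuminate, read both
    sensing elements, fuse their samples, and transmit the result."""
    for i in range(num_sources):
        set_light_source(i, True)        # illuminate the site with source i
        first = read_first_sensor()      # data from the first light sensing element
        second = read_second_sensor()    # data from the second light sensing element
        set_light_source(i, False)
        # Naive placeholder fusion: average corresponding samples pairwise.
        fused = [(a + b) / 2.0 for a, b in zip(first, second)]
        transmit(fused)
```

With stub callables, each light source produces one fused frame per pass; a real control circuit would substitute the fusion step with the system's actual imaging computation.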
[0022] In some aspects, a non-transitory computer-readable medium can store computer-readable instructions that, when executed, cause a machine to control an operation of a plurality of light sources directed at a tissue sample, each light source being configured to emit light with a specified central wavelength; to receive, from a first light sensing element, first data related to the reflected illumination of a first portion of the surgical site when the first portion of the surgical site is illuminated by at least one of the plurality of light sources; to receive, from a second light sensing element, second data related to the reflected illumination of a second portion of the surgical site when the second portion of the surgical site is illuminated by at least one of the plurality of light sources; to compute imaging data based on the first data received from the first light sensing element and on the second data received from the second light sensing element; and to transmit the imaging data for reception by a display system. In some aspects, the second field of view overlaps at least a portion of the first field of view.
FIGURES
[0023] The features of the various aspects are set forth with particularity in the appended claims. The various aspects, however, both as to organization and methods of operation, together with further objects and advantages thereof, may best be understood by reference to the description presented below, considered together with the accompanying drawings, as follows.
[0024] Figure 1 is a block diagram of a computer-implemented interactive surgical system, in accordance with at least one aspect of the present description.
[0025] Figure 2 is a surgical system being used to perform a surgical procedure in an operating room, in accordance with at least one aspect of the present description.
[0026] Figure 3 is a central device, or "central surgical controller", paired with a visualization system, a robotic system, and an intelligent instrument, in accordance with at least one aspect of the present description.
[0027] Figure 4 is a partial perspective view of a central surgical controller compartment, and of a combination generator module slidably received in the central surgical controller compartment, in accordance with at least one aspect of the present description.
[0028] Figure 5 is a perspective view of a combination generator module with bipolar, ultrasonic, and monopolar contacts and a smoke evacuation component, in accordance with at least one aspect of the present description.
[0029] Figure 6 illustrates different power bus connectors for a plurality of side coupling ports of a side modular housing configured to receive a plurality of modules, in accordance with at least one aspect of the present description.
[0030] Figure 7 illustrates a vertical modular housing configured to receive a plurality of modules, in accordance with at least one aspect of the present description.
[0031] Figure 8 illustrates a surgical data network comprising a central modular communication controller configured to connect modular devices located in one or more operating rooms of a healthcare facility, or in any room of a healthcare facility specially equipped for surgical operations, to the cloud, in accordance with at least one aspect of this description.
[0032] Figure 9 illustrates a computer-implemented interactive surgical system, in accordance with at least one aspect of the present description.
[0033] Figure 10 illustrates a central surgical controller comprising a plurality of modules coupled to a modular control tower, in accordance with at least one aspect of the present description.
[0034] Figure 11 illustrates one aspect of a central controller device of a universal serial bus (USB) network, in accordance with at least one aspect of the present description.
[0035] Figure 12 illustrates a logic diagram of a control system for a surgical instrument or tool, in accordance with at least one aspect of the present description.
[0036] Figure 13 illustrates a control circuit configured to control aspects of the surgical instrument or tool, in accordance with at least one aspect of the present description.
[0037] Figure 14 illustrates a combinational logic circuit configured to control aspects of the surgical instrument or tool, in accordance with at least one aspect of the present description.
[0038] Figure 15 illustrates a sequential logic circuit configured to control aspects of the surgical instrument or tool, in accordance with at least one aspect of the present description.
[0039] Figure 16 illustrates a surgical instrument or tool comprising a plurality of motors that can be activated to perform various functions, in accordance with at least one aspect of the present description.
[0040] Figure 17 is a schematic diagram of a robotic surgical instrument configured to operate a surgical tool described herein, in accordance with at least one aspect of the present description.
[0041] Figure 18 illustrates a block diagram of a surgical instrument programmed to control the distal translation of a displacement member, in accordance with an aspect of the present description.
[0042] Figure 19 is a schematic diagram of a surgical instrument configured to control various functions, in accordance with at least one aspect of the present description.
[0043] Figure 20 is a simplified block diagram of a generator configured to provide inductorless tuning, among other benefits, in accordance with at least one aspect of the present description.
[0044] Figure 21 illustrates an example of a generator, which is one form of the generator of Figure 20, in accordance with at least one aspect of the present description.
[0045] Figure 22A illustrates a visualization system that can be incorporated into a surgical system, in accordance with at least one aspect of the present description.
[0046] Figure 22B illustrates a top plan view of a hand unit of the visualization system of Figure 22A, in accordance with at least one aspect of the present description.
[0047] Figure 22C illustrates a side plan view of the hand unit depicted in Figure 22A together with an imaging sensor disposed therein, in accordance with at least one aspect of the present description.
[0048] Figure 22D illustrates a plurality of the imaging sensors depicted in Figure 22C, in accordance with at least one aspect of the present description.
[0049] Figure 23A illustrates a plurality of laser emitters that can be incorporated into the visualization system of Figure 22A, in accordance with at least one aspect of the present description.
[0050] Figure 23B illustrates the illumination of an image sensor that has a Bayer color filter pattern, in accordance with at least one aspect of the present description.
[0051] Figure 23C illustrates a graphical representation of the operation of a pixel array for a plurality of frames, in accordance with at least one aspect of the present description.
[0052] Figure 23D illustrates a schematic representation of an example of an operational sequence of chrominance and luminance frames, in accordance with at least one aspect of the present description.
[0053] Figure 23E illustrates an example of sensor and emitter patterns, in accordance with at least one aspect of the present description.
[0054] Figure 23F illustrates a graphical representation of the operation of a pixel array, in accordance with at least one aspect of the present description.
[0055] Figure 24 illustrates a schematic representation of an example of instrumentation for NIR spectroscopy, in accordance with an aspect of the present description.
[0056] Figure 25 schematically illustrates an example of instrumentation for determining NIRS based on Fourier transform infrared imaging, in accordance with at least one aspect of the present description.
[0057] Figures 26A to 26C illustrate a change in the wavelength of light scattered by the movement of blood cells, in accordance with at least one aspect of the present description.
[0058] Figure 27 illustrates an aspect of instrumentation that can be used to detect a Doppler effect in laser light scattered from portions of a tissue, in accordance with at least one aspect of the present description.
[0059] Figure 28 schematically illustrates some optical effects on light incident on a tissue that has subsurface structures, in accordance with at least one aspect of the present description.
[0060] Figure 29 illustrates an example of the effects on a Doppler analysis of light incident on a tissue sample that has subsurface structures, in accordance with at least one aspect of the present description.
[0061] Figures 30A to 30C schematically illustrate the detection of moving blood cells at a tissue depth based on a laser Doppler analysis at a variety of laser wavelengths, in accordance with at least one aspect of this description.
[0062] Figure 30D illustrates the effect of illuminating a CMOS imaging sensor with a plurality of wavelengths of light over time, in accordance with at least one aspect of the present description.
[0063] Figure 31 illustrates an example of a use of Doppler imaging to detect the presence of subsurface blood vessels, in accordance with at least one aspect of the present description.
[0064] Figure 32 illustrates a method for identifying a subsurface blood vessel based on a Doppler effect of blue light due to blood cells flowing through it, in accordance with at least one aspect of the present description.
[0065] Figure 33 schematically illustrates the location of a deep subsurface blood vessel, in accordance with at least one aspect of the present description.
[0066] Figure 34 schematically illustrates the location of a superficial subsurface blood vessel, in accordance with at least one aspect of the present description.
[0067] Figure 35 illustrates a composite image comprising a surface image and an image of a subsurface blood vessel, in accordance with at least one aspect of the present description.
[0068] Figure 36 is a flow chart of a method for determining a depth of a surface feature in a piece of tissue, in accordance with at least one aspect of the present description.
[0069] Figure 37 illustrates the effect of the location and characteristics of non-vascular structures on light incident on a tissue sample, in accordance with at least one aspect of the present description.
[0070] Figure 38 schematically depicts an example of components used in a full-field optical coherence tomography (OCT) device, in accordance with at least one aspect of the present description.
[0071] Figure 39 schematically illustrates the effect of tissue anomalies on the light reflected from a tissue sample, in accordance with at least one aspect of the present description.
[0072] Figure 40 illustrates a display of an image derived from a combination of tissue visualization modalities, in accordance with at least one aspect of the present description.
[0073] Figures 41A to 41C illustrate various aspects of displays that can be provided to a surgeon for visual identification of a combination of tissue surface and subsurface structures at a surgical site, in accordance with at least one aspect of the present description.
[0074] Figure 42 is a flow chart of a method for providing information related to a tissue characteristic to an intelligent surgical instrument, in accordance with at least one aspect of the present description.
[0075] Figures 43A and 43B illustrate a multi-pixel light sensor receiving light reflected by a tissue illuminated by sequential exposure to red, green, blue, and infrared laser light sources, and to red, green, blue, and ultraviolet laser light sources, respectively, in accordance with at least one aspect of the present description.
[0076] Figures 44A and 44B illustrate the distal end of an elongated camera probe with a single light sensor and with two light sensors, respectively, in accordance with at least one aspect of the present description.
[0077] Figure 44C illustrates a perspective view of an example of a monolithic sensor that has a plurality of pixel arrays, in accordance with at least one aspect of the present description.
[0078] Figure 45 illustrates an example of a pair of fields of view available to two image sensors of an elongated camera probe, in accordance with at least one aspect of the present description.
[0079] Figures 46A to 46D illustrate additional examples of a pair of fields of view available to two image sensors of an elongated camera probe, in accordance with at least one aspect of the present description.
[0080] Figures 47A to 47C illustrate an example of the use of an imaging system that incorporates the features disclosed in Figure 46D, in accordance with at least one aspect of the present description.
[0081] Figures 48A and 48B illustrate another example of the use of a dual imaging system, in accordance with at least one aspect of the present description.
[0082] Figures 49A to 49C illustrate examples of a sequence of surgical steps that can benefit from the use of multiple image analyses at the surgical site, in accordance with at least one aspect of the present description.
[0083] Figure 50 is a timeline representing the situational recognition of a central surgical controller, in accordance with at least one aspect of the present description.
DESCRIPTION
[0084] The applicant for this application holds the following provisional US patent applications, filed on March 28, 2018, each of which is incorporated herein by reference in its entirety:
[0085] US provisional patent application serial number 62/649,302, entitled INTERACTIVE SURGICAL SYSTEMS WITH ENCRYPTED COMMUNICATION CAPABILITIES;
[0086] US provisional patent application serial number 62/649,294, entitled DATA STRIPPING METHOD TO INTERROGATE PATIENT RECORDS AND CREATE ANONYMIZED RECORD;
[0087] US provisional patent application serial number 62/649,300, entitled SURGICAL HUB SITUATIONAL AWARENESS;
[0088] US provisional patent application serial number 62/649,309, entitled SURGICAL HUB SPATIAL AWARENESS TO DETERMINE DEVICES IN OPERATING THEATER;
[0089] US provisional patent application serial number 62/649,310, entitled COMPUTER IMPLEMENTED INTERACTIVE SURGICAL SYSTEMS;
[0090] US provisional patent application serial number 62/649,291, entitled USE OF LASER LIGHT AND RED-GREEN-BLUE COLORATION TO DETERMINE PROPERTIES OF BACK SCATTERED LIGHT;
[0091] US provisional patent application serial number 62/649,296, entitled ADAPTIVE CONTROL PROGRAM UPDATES FOR SURGICAL DEVICES;
[0092] US provisional patent application serial number 62/649,333, entitled CLOUD-BASED MEDICAL ANALYTICS FOR CUSTOMIZATION AND RECOMMENDATIONS TO A USER;
[0093] US provisional patent application serial number
62/649,327, entitled CLOUD-BASED MEDICAL ANALYTICS FOR SECURITY AND AUTHENTICATION TRENDS AND REACTIVE MEASURES;
[0094] US provisional patent application serial number 62/649,315, entitled DATA HANDLING AND PRIORITIZATION IN A CLOUD ANALYTICS NETWORK;
[0095] US provisional patent application serial number 62/649,313, entitled CLOUD INTERFACE FOR COUPLED SURGICAL DEVICES;
[0096] US provisional patent application serial number 62/649,320, entitled DRIVE ARRANGEMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS;
[0097] US provisional patent application serial number 62/649,307, entitled AUTOMATIC TOOL ADJUSTMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS; and
[0098] US provisional patent application serial number 62/649,323, entitled SENSING ARRANGEMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS.
[0099] The applicant for this application holds the following US patent applications, filed on March 29, 2018, each of which is incorporated herein by reference in its entirety:
[00100] US patent application serial number, entitled INTERACTIVE SURGICAL SYSTEMS WITH ENCRYPTED COMMUNICATION CAPABILITIES; Attorney document number
[00101] US patent application serial number, entitled
[00102] US patent application serial number, entitled
[00103] US patent application serial number, entitled SPATIAL AWARENESS OF SURGICAL HUBS IN OPERATING ROOMS; Attorney document number END8499USNP3/170766-3;
[00104] US patent application serial number, entitled
[00105] US patent application serial number, entitled SURGICAL HUB CONTROL ARRANGEMENTS; Attorney document number END8499USNP5/170766-5;
[00106] US patent application serial number, entitled
[00107] US patent application serial number, entitled COMMUNICATION HUB AND STORAGE DEVICE FOR
[00108] US patent application serial number,
[00109] US patent application serial number, entitled DATA PAIRING TO INTERCONNECT A DEVICE MEASURED PARAMETER WITH AN OUTCOME; Attorney document number END8500USNP3/170767-3;
[00110] US patent application serial number, entitled SURGICAL HUB SITUATIONAL AWARENESS; Attorney document number END8501USNP/170768;
[00111] US patent application serial number, entitled SURGICAL SYSTEM DISTRIBUTED PROCESSING; Attorney document number END8501USNP1/170768-1;
[00112] US patent application serial number, entitled AGGREGATION AND REPORTING OF SURGICAL HUB DATA; Attorney document number END8501USNP2/170768-2;
[00113] US patent application serial number, entitled
[00114] US patent application serial number, entitled DISPLAY OF ALIGNMENT OF STAPLE CARTRIDGE TO PRIOR LINEAR STAPLE LINE; Attorney document number END8502USNP1/170769-1;
[00115] US patent application serial number, entitled STERILE FIELD INTERACTIVE CONTROL DISPLAYS; Attorney document number END8502USNP2/170769-2;
[00116] US patent application serial number, entitled COMPUTER IMPLEMENTED INTERACTIVE SURGICAL SYSTEMS; Attorney document number END8503USNP/170770;
[00117] US patent application serial number, entitled USE OF LASER LIGHT AND RED-GREEN-BLUE COLORATION TO DETERMINE PROPERTIES OF BACK SCATTERED LIGHT; Attorney document number END8504USNP/170771; and
[00118] US patent application serial number, entitled CHARACTERIZATION OF TISSUE IRREGULARITIES THROUGH THE USE OF MONOCHROMATIC LIGHT REFRACTIVITY; Attorney document number END8504USNP1/170771-1.
[00119] The applicant for this application holds the following US patent applications, filed on March 29, 2018, each of which is incorporated herein by reference in its entirety:
[00120] US patent application serial number, entitled ADAPTIVE CONTROL PROGRAM UPDATES FOR SURGICAL DEVICES; Attorney document number END8506USNP/170773;
[00121] US patent application serial number, entitled ADAPTIVE CONTROL PROGRAM UPDATES FOR SURGICAL HUBS; Attorney document number END8506USNP1/170773-1;
[00122] US patent application serial number, entitled CLOUD-BASED MEDICAL ANALYTICS FOR CUSTOMIZATION AND RECOMMENDATIONS TO A USER; Attorney document number END8507USNP/170774;
[00123] US patent application serial number, entitled CLOUD-BASED MEDICAL ANALYTICS FOR LINKING OF LOCAL
[00124] US patent application serial number, entitled CLOUD-BASED MEDICAL ANALYTICS FOR MEDICAL FACILITY
[00125] US patent application serial number, entitled CLOUD-BASED MEDICAL ANALYTICS FOR SECURITY AND AUTHENTICATION TRENDS AND REACTIVE MEASURES; Attorney document number END8508USNP/170775;
[00126] US patent application serial number, entitled DATA HANDLING AND PRIORITIZATION IN A CLOUD ANALYTICS NETWORK; Attorney document number END8509USNP/170776; and
[00127] US patent application serial number, entitled CLOUD INTERFACE FOR COUPLED SURGICAL DEVICES; Attorney document number END8510USNP/170777.
[00128] The applicant for this application holds the following US patent applications, filed on March 29, 2018, each of which is incorporated herein by reference in its entirety:
[00129] US patent application serial number, entitled DRIVE ARRANGEMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS; Attorney document number END8511USNP/170778;
[00130] US patent application serial number, entitled COMMUNICATION ARRANGEMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS; Attorney document number END8511USNP1/170778-1;
[00131] US patent application serial number, entitled CONTROLS FOR ROBOT-ASSISTED SURGICAL PLATFORMS; Attorney document number END8511USNP2/170778-2;
[00132] US patent application serial number, entitled AUTOMATIC TOOL ADJUSTMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS; Attorney document number
[00133] US patent application serial number, entitled CONTROLLERS FOR ROBOT-ASSISTED SURGICAL PLATFORMS; Attorney document number END8512USNP1/170779-1;
[00134] US patent application serial number, entitled COOPERATIVE SURGICAL ACTIONS FOR ROBOT-ASSISTED SURGICAL PLATFORMS; Attorney document number END8512USNP2/170779-2;
[00135] US patent application serial number, entitled DISPLAY ARRANGEMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS; Attorney document number END8512USNP3/170779-3; and
[00136] US patent application serial number, entitled SENSING ARRANGEMENTS FOR ROBOT-ASSISTED SURGICAL PLATFORMS; Attorney document number END8513USNP/170780.
[00137] Before explaining in detail the various aspects of surgical instruments and generators, it should be noted that the illustrative examples are not limited, in terms of application or use, to the details of construction and arrangement of parts illustrated in the accompanying drawings and description.
The illustrative examples can be implemented or incorporated into other aspects, variations, and modifications, and can be practiced or carried out in a variety of ways. Furthermore, except where otherwise indicated, the terms and expressions used in the present invention were chosen for the purpose of describing the illustrative examples for the convenience of the reader and not for the purpose of limitation. In addition, it should be understood that one or more of the aspects, expressions of aspects, and/or examples described below can be combined with any one or more of the other aspects, expressions of aspects, and/or examples described below.
[00138] Referring to Figure 1, a computer-implemented interactive surgical system 100 includes one or more surgical systems 102 and a cloud-based system (for example, cloud 104, which may include a remote server 113 coupled to a storage device 105). Each surgical system 102 includes at least one central surgical controller 106 in communication with the cloud 104, which can include a remote server 113. In one example, as illustrated in Figure 1, the surgical system 102 includes a visualization system 108, a robotic system 110, and a smart handheld surgical instrument 112, which are configured to communicate with one another and/or with the central controller 106. In some aspects, a surgical system 102 may include an M number of central surgical controllers 106, an N number of visualization systems 108, an O number of robotic systems 110, and a P number of smart handheld surgical instruments 112, where M, N, O, and P are integers greater than or equal to one.
[00139] Figure 2 depicts an example of a surgical system 102 being used to perform a surgical procedure on a patient who is lying on an operating table 114 in a surgical operating room 116. A robotic system 110 is used in the surgical procedure as part of the surgical system 102.
The robotic system 110 includes a surgeon's console 118, a patient cart 120 (surgical robot), and a robotic central surgical controller 122. The patient cart 120 can manipulate at least one removably attached surgical tool 117 through a minimally invasive incision in the patient's body while the surgeon views the surgical site through the surgeon's console 118. An image of the surgical site can be obtained by a medical imaging device 124, which can be manipulated by the patient cart 120 to orient the imaging device 124. The robotic central surgical controller 122 can be used to process the images of the surgical site for subsequent display to the surgeon through the surgeon's console 118.
[00140] [00140] Other types of robotic systems can readily be adapted for use with the surgical system 102. Various examples of robotic systems and surgical instruments that are suitable for use with the present description are described in US provisional patent application serial number 62/611.339, entitled ROBOT ASSISTED SURGICAL PLATFORM, filed on December 28, 2017, the description of which is incorporated herein by reference in its entirety.
[00141] [00141] Several examples of cloud-based analysis that are performed by the cloud 104, and that are suitable for use with the present description, are described in US provisional patent application serial number 62/611.340, entitled CLOUD-BASED MEDICAL ANALYTICS, filed on December 28, 2017, the description of which is incorporated herein by reference in its entirety.
[00142] [00142] In several aspects, the imaging device 124 includes at least one image sensor and one or more optical components. Suitable image sensors include, but are not limited to, charge-coupled device (CCD) sensors and complementary metal-oxide-semiconductor (CMOS) sensors.
[00143] [00143] The optical components of the imaging device 124 may include one or more light sources and/or one or more lenses. The one or more light sources can be directed to illuminate portions of the surgical field.
The one or more image sensors can receive reflected or refracted light from the surgical field, including reflected or refracted light from tissue and/or surgical instruments.
[00144] [00144] The one or more light sources can be configured to radiate electromagnetic energy in the visible spectrum as well as in the invisible spectrum. The visible spectrum, sometimes called the optical spectrum or the luminous spectrum, is that portion of the electromagnetic spectrum that is visible to (that is, can be detected by) the human eye, and may be called visible light or simply light. A typical human eye will respond to wavelengths in air from about 380 nm to about 750 nm.
[00145] [00145] The invisible spectrum (that is, the non-luminous spectrum) is that portion of the electromagnetic spectrum that lies below and above the visible spectrum (that is, wavelengths below about 380 nm and above about 750 nm). The invisible spectrum is not detectable by the human eye. Wavelengths greater than about 750 nm are longer than the red end of the visible spectrum, and they constitute invisible infrared (IR), microwave, and radio electromagnetic radiation. Wavelengths shorter than about 380 nm are shorter than the violet end of the visible spectrum, and they constitute invisible ultraviolet, X-ray, and gamma-ray electromagnetic radiation.
[00146] [00146] In several aspects, the imaging device 124 is configured for use in a minimally invasive procedure. Examples of imaging devices suitable for use with the present description include, but are not limited to, an arthroscope, angioscope, bronchoscope, choledochoscope, colonoscope, cystoscope, duodenoscope, enteroscope, esophagogastroduodenoscope (gastroscope), endoscope, laryngoscope, nasopharyngoscope, nephroscope, sigmoidoscope, thoracoscope, and ureteroscope.
[00147] [00147] In one aspect, the imaging device uses multispectral monitoring to discriminate topography and underlying structures.
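The approximate spectral boundaries set out above can be summarized computationally. The following sketch is purely illustrative and forms no part of the described system; the function name and band labels are invented here, and only the roughly 380 nm and 750 nm limits come from the description.

```python
def classify_wavelength(nm: float) -> str:
    """Classify a central wavelength (in nanometers) against the
    approximate visible-spectrum boundaries of about 380-750 nm."""
    if nm < 380.0:
        # Shorter than the violet end: ultraviolet, X-ray, gamma-ray.
        return "invisible (below visible spectrum)"
    if nm > 750.0:
        # Longer than the red end: infrared, microwave, radio.
        return "invisible (above visible spectrum)"
    return "visible"

# For example, a 650 nm red source falls in the visible band, while a
# 1064 nm source lies in the invisible (infrared) portion of the spectrum.
print(classify_wavelength(650))
print(classify_wavelength(1064))
```

Such a classification mirrors how band-specific filters or sensors separate visible channels from IR and ultraviolet channels in multispectral capture.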
A multispectral image is one that captures image data within specific wavelength bands across the electromagnetic spectrum. The wavelengths can be separated by filters or by using instruments that are sensitive to particular wavelengths, including light from frequencies beyond the visible light range, for example, IR and ultraviolet light. Spectral imaging can allow the extraction of additional information that the human eye fails to capture with its receptors for red, green, and blue. The use of multispectral imaging is described in greater detail under the heading "Advanced Imaging Acquisition Module" in US provisional patent application serial number 62/611.341, entitled INTERACTIVE SURGICAL PLATFORM, filed on December 28, 2017, the description of which is incorporated herein by reference in its entirety. Multispectral monitoring can be a useful tool for relocating a surgical field after a surgical task is completed, in order to perform one or more of the previously described tests on the treated tissue.
[00148] [00148] It is axiomatic that strict sterilization of the operating room and surgical equipment is necessary during any surgery. The strict hygiene and sterilization conditions required in an "operating room", that is, an operating or treatment room, justify the highest possible sterility of all medical devices and equipment. Part of that sterilization process is the need to sterilize anything that comes into contact with the patient or penetrates the sterile field, including the imaging device 124 and its connectors and components. It will be understood that the sterile field may be considered a specified area, such as within a tray or on a sterile towel, that is considered free of microorganisms, or the sterile field may be considered an area immediately around a patient that has been prepared for a surgical procedure. The sterile field may include the scrub team members, who are properly attired, and all furniture and fixtures in the area.
[00149] [00149] In several aspects, the visualization system 108 includes one or more imaging sensors, one or more image processing units, one or more storage arrays, and one or more screens that are strategically arranged with respect to the sterile field, as shown in Figure 2. In one aspect, the visualization system 108 includes an interface for HL7, PACS, and EMR. Various components of the visualization system 108 are described under the heading "Advanced Imaging Acquisition Module" in US provisional patent application serial number 62/611.341, entitled INTERACTIVE SURGICAL PLATFORM, filed on December 28, 2017, the description of which is incorporated herein by reference in its entirety.
[00150] [00150] As shown in Figure 2, a primary screen 119 is positioned in the sterile field so as to be visible to the operator at the operating table 114. In addition, a visualization tower 111 is positioned outside the sterile field. The visualization tower 111 includes a first non-sterile screen 107 and a second non-sterile screen 109, which face away from each other. The visualization system 108, guided by the central controller 106, is configured to use the screens 107, 109, and 119 to coordinate the flow of information to operators inside and outside the sterile field. For example, the central controller 106 can cause the visualization system 108 to display a snapshot of a surgical site, as recorded by an imaging device 124, on a non-sterile screen 107 or 109, while maintaining a live feed of the surgical site on the primary screen 119. The snapshot on the non-sterile screen 107 or 109 can allow a non-sterile operator to perform a diagnostic step relevant to the surgical procedure, for example.
[00151] [00151] In one aspect, the central controller 106 is also configured to route a diagnostic input or feedback entered by a non-sterile operator at the visualization tower 111 to the primary screen 119 within the sterile field, where it can be seen by a sterile operator at the operating table.
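The screen coordination just described - a live feed kept on the sterile-field primary screen while snapshots go to the non-sterile tower screens, with operator input routed back - might be sketched as a simple dispatch table. Every class, method, and screen name below is a hypothetical illustration and not part of the described system.

```python
class DisplayRouter:
    """Illustrative sketch of hub-directed screen coordination: the
    primary in-field screen keeps the live feed while snapshots are
    shown outside the sterile field, and annotations are routed back."""

    def __init__(self):
        # One slot per screen: primary (sterile field) and two tower screens.
        self.screens = {"primary": None, "tower_1": None, "tower_2": None}

    def show_live_feed(self, frame):
        # The live surgical-site transmission stays on the primary screen.
        self.screens["primary"] = ("live", frame)

    def push_snapshot(self, image, tower="tower_1"):
        # A recorded snapshot is displayed on a non-sterile tower screen.
        self.screens[tower] = ("snapshot", image)

    def route_annotation(self, annotated_image):
        # Non-sterile operator input is routed back to the primary screen.
        self.screens["primary"] = ("annotated", annotated_image)

router = DisplayRouter()
router.show_live_feed("frame-001")
router.push_snapshot("site-snapshot")
router.route_annotation("site-snapshot+markup")
print(router.screens["primary"][0])  # the primary screen now shows the annotated image
```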
In one example, the input may be in the form of a modification of the snapshot displayed on the non-sterile screen 107 or 109, which can be routed to the primary screen 119 by the central controller 106.
[00152] [00152] With reference to Figure 2, a surgical instrument 112 is being used in the surgical procedure as part of the surgical system 102.
[00153] [00153] Now with reference to Figure 3, a central controller 106 is shown in communication with a visualization system 108, a robotic system 110, and a smart handheld surgical instrument 112. The central controller 106 includes a central controller screen 135, an imaging module 138, a generator module 140, a communication module 130, a processor module 132, and a storage array 134. In certain aspects, as shown in Figure 3, the central controller 106 also includes a smoke evacuation module 126 and/or a suction/irrigation module 128.
[00154] [00154] During a surgical procedure, the application of energy to tissue, for sealing and/or cutting, is generally associated with smoke evacuation, suction of excess fluid, and/or irrigation of the tissue. Fluid, power, and/or data lines from different sources are often entangled during the surgical procedure. Valuable time can be lost addressing this issue during a surgical procedure. To untangle the lines, it may be necessary to disconnect the lines from their respective modules, which may require a restart of the modules. The modular compartment of the central controller 136 offers a unified environment for managing the power, data, and fluid lines, which reduces the frequency of entanglement between such lines.
[00155] [00155] Aspects of the present description feature a central surgical controller for use in a surgical procedure that involves the application of energy to tissue at a surgical site. The central surgical controller includes a central controller compartment and a combination generator module slidably received in a docking station of the central controller compartment.
The docking station includes data and power contacts. The combination generator module includes two or more of an ultrasonic energy generating component, a bipolar RF energy generating component, and a monopolar RF energy generating component, which are housed in a single unit. In one aspect, the combination generator module also includes a smoke evacuation component, at least one energy application cable to connect the combination generator module to a surgical instrument, at least one smoke evacuation component configured to evacuate smoke, fluid, and/or particulates generated by the application of therapeutic energy to the tissue, and a fluid line that extends from the remote surgical site to the smoke evacuation component.
[00156] [00156] In one aspect, the fluid line is a first fluid line, and a second fluid line extends from the remote surgical site to a suction and irrigation module slidably received in the central controller compartment. In one aspect, the central controller compartment comprises a fluid interface.
[00157] [00157] Certain surgical procedures may require the application of more than one type of energy to the tissue. One type of energy may be more beneficial for cutting the tissue, while another type of energy may be more beneficial for sealing the tissue. For example, a bipolar generator can be used to seal the tissue while an ultrasonic generator can be used to cut the sealed tissue. Aspects of the present description present a solution in which a modular compartment of the central controller 136 is configured to accommodate different generators and to facilitate interactive communication between them. One of the advantages of the modular compartment of the central controller 136 is that it allows the quick removal and/or replacement of various modules.
[00158] [00158] Aspects of the present description present a modular surgical compartment for use in a surgical procedure that involves the application of energy to tissue.
The modular surgical compartment includes a first energy generator module, configured to generate a first energy for application to the tissue, and a first docking station comprising a first docking port that includes first data and power contacts, wherein the first energy generator module is slidably movable into an electrical coupling with the data and power contacts, and wherein the first energy generator module is slidably movable out of the electrical coupling with the first data and power contacts.
[00159] [00159] In addition to the above, the modular surgical compartment also includes a second energy generator module configured to generate a second energy, different from the first energy, for application to the tissue, and a second docking station comprising a second docking port that includes second data and power contacts, wherein the second energy generator module is slidably movable into an electrical coupling with the power and data contacts, and wherein the second energy generator module is slidably movable out of the electrical coupling with the second power and data contacts.
[00160] [00160] In addition, the modular surgical compartment also includes a communication bus between the first docking port and the second docking port, configured to facilitate communication between the first energy generator module and the second energy generator module.
[00161] [00161] With reference to Figures 3 to 7, aspects of the present description are presented for a modular compartment of the central controller 136 that allows the modular integration of a generator module 140, a smoke evacuation module 126, and a suction/irrigation module 128. The modular compartment of the central controller 136 further facilitates interactive communication between the modules 140, 126, 128. As illustrated in Figure 5, the generator module 140 can be a generator module with integrated monopolar, bipolar, and ultrasonic components, supported in a single housing unit 139 slidably insertable into the modular compartment of the central controller 136.
As shown in Figure 5, the generator module 140 can be configured to connect to a monopolar device 146, a bipolar device 147, and an ultrasonic device 148. Alternatively, the generator module 140 may comprise a series of monopolar, bipolar, and/or ultrasonic generator modules that interact through the modular compartment of the central controller 136. The modular compartment of the central controller 136 can be configured to facilitate the insertion of multiple generators and the interactive communication between the generators docked in the modular compartment of the central controller 136, so that the generators act as a single generator.
[00162] [00162] In one aspect, the modular compartment of the central controller 136 comprises a modular power and communication backplane 149 with external and wireless communication headers to enable the removable attachment of the modules 140, 126, 128 and interactive communication between them.
[00163] [00163] In one aspect, the modular compartment of the central controller 136 includes docking stations, or drawers, 151, which are configured to slidably receive the modules 140, 126, 128. Figure 4 illustrates a partial perspective view of a compartment of the central surgical controller 136, and of a combination generator module 145 slidably received in a docking station 151 of the compartment of the central surgical controller 136. A docking port 152 with power and data contacts on a rear side of the combination generator module 145 is configured to engage a corresponding docking port 150 with power and data contacts of a corresponding docking station 151 of the modular compartment of the central controller 136 as the combination generator module 145 is slid into position within the corresponding docking station 151 of the modular compartment of the central controller 136. In one aspect, the combination generator module 145 includes a bipolar, ultrasonic, and monopolar module and a smoke evacuation module integrated together in a single housing unit 139, as shown in Figure 5.
[00164] [00164] In several aspects, the smoke evacuation module 126 includes a fluid line 154 that carries captured/collected smoke and fluid away from a surgical site and to, for example, the smoke evacuation module 126. Vacuum suction that originates in the smoke evacuation module 126 can draw the smoke into an opening of a utility conduit at the surgical site. The utility conduit, coupled to the fluid line, can be in the form of a flexible tube that terminates at the smoke evacuation module 126. The utility conduit and the fluid line define a fluid path that extends toward the smoke evacuation module 126 received in the central controller compartment 136.
[00165] [00165] In several aspects, the suction/irrigation module 128 is coupled to a surgical tool comprising a fluid aspiration line and a fluid suction line. In one example, the aspiration and suction fluid lines are in the form of flexible tubes that extend from the surgical site toward the suction/irrigation module 128. One or more drive systems can be configured to cause irrigation and aspiration of fluids to and from the surgical site.
[00166] [00166] In one aspect, the surgical tool includes a drive shaft that has an end effector at a distal end thereof, at least one energy treatment implement associated with the end effector, a suction tube, and an irrigation tube. The suction tube can have an inlet port at a distal end thereof, and the suction tube extends through the drive shaft. Similarly, an irrigation tube can extend through the drive shaft and can have an inlet port in proximity to the energy application implement. The energy application implement is configured to deliver ultrasonic and/or RF energy to the surgical site and is coupled to the generator module 140 by a cable that initially extends through the drive shaft.
[00167] [00167] The irrigation tube can be in fluid communication with a fluid source, and the suction tube can be in fluid communication with a vacuum source.
The fluid source and/or the vacuum source can be housed in the suction/irrigation module 128. In one example, the fluid source and/or the vacuum source can be housed in the central controller compartment 136 separately from the suction/irrigation module 128. In such an example, a fluid interface can be configured to connect the suction/irrigation module 128 to the fluid source and/or the vacuum source.
[00168] [00168] In one aspect, the modules 140, 126, 128 and/or their corresponding docking stations in the modular compartment of the central controller 136 may include alignment features that are configured to align the docking ports of the modules into engagement with their counterparts in the docking stations of the modular compartment of the central controller 136. For example, as shown in Figure 4, the combination generator module 145 includes side brackets 155 that are configured to slidably engage the corresponding brackets 156 of the corresponding docking station 151 of the modular compartment of the central controller 136. The brackets cooperate to guide the docking port contacts of the combination generator module 145 into an electrical coupling with the docking port contacts of the modular compartment of the central controller 136.
[00169] [00169] In some aspects, the drawers 151 of the modular compartment of the central controller 136 are the same, or substantially the same, size, and the modules are sized to be received in the drawers 151. For example, the side brackets 155 and/or 156 can be larger or smaller depending on the size of the module. In other aspects, the drawers 151 are different in size and are each designed to accommodate a specific module.
[00170] [00170] In addition, the contacts of a specific module can be keyed to engage with the contacts of a specific drawer, in order to avoid inserting a module into a drawer with mismatched contacts.
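The keyed-contact behavior described in paragraph [00170] amounts to a compatibility check: a module may only seat in a drawer whose contacts match its own. The following sketch is illustrative only; all module, drawer, and contact identifiers are invented for the example.

```python
# Hypothetical contact-key codes for modules and drawers; a module may
# only dock where every one of its keyed contacts has a counterpart.
MODULE_KEYS = {
    "combo_generator_145": {"power_a", "data_x"},
    "smoke_evac_126": {"power_b", "data_y", "fluid"},
}
DRAWER_KEYS = {
    "drawer_151a": {"power_a", "data_x"},
    "drawer_151b": {"power_b", "data_y", "fluid"},
}

def can_dock(module: str, drawer: str) -> bool:
    """A module seats only in a drawer whose contact set covers its own."""
    return MODULE_KEYS[module] <= DRAWER_KEYS[drawer]

print(can_dock("combo_generator_145", "drawer_151a"))  # True: contacts match
print(can_dock("smoke_evac_126", "drawer_151a"))       # False: mismatched contacts
```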
[00171] [00171] As shown in Figure 4, the docking port 150 of one drawer 151 can be coupled to the docking port 150 of another drawer 151 via a communication link 157 to facilitate interactive communication between the modules housed in the modular compartment of the central controller 136. Alternatively, or in addition, the docking ports 150 of the modular compartment of the central controller 136 can facilitate interactive wireless communication between the modules housed in the modular compartment of the central controller 136. Any suitable wireless communication can be used, such as Air Titan-Bluetooth.
[00172] [00172] Figure 6 illustrates individual power bus connectors for a plurality of side docking ports of a side modular compartment 160 configured to receive a plurality of modules of a central surgical controller 206. The side modular compartment 160 is configured to receive and laterally interconnect the modules 161. The modules 161 are slidably inserted into docking stations 162 of the side modular compartment 160, which includes a backplane for interconnecting the modules 161. As shown in Figure 6, the modules 161 are arranged laterally in the side modular compartment 160. Alternatively, the modules 161 can be arranged vertically in a side modular compartment.
[00173] [00173] Figure 7 illustrates a vertical modular compartment 164 configured to receive a plurality of modules 165 of the central surgical controller 106. The modules 165 are slidably inserted into docking stations, or drawers, 167 of the vertical modular compartment 164, which includes a backplane for interconnecting the modules 165. Although the drawers 167 of the vertical modular compartment 164 are arranged vertically, in certain cases a vertical modular compartment 164 may include drawers that are arranged laterally. In addition, the modules 165 can interact with one another through the docking ports of the vertical modular compartment 164. In the example of Figure 7, a screen 177 is provided to display data relevant to the operation of the modules 165.
In addition, the vertical modular compartment 164 includes a master module 178 that houses a plurality of submodules slidably received in the master module 178.
[00174] [00174] In several aspects, the imaging module 138 comprises an integrated video processor and a modular light source, and is adapted for use with various imaging devices. In one aspect, the imaging device is composed of a modular compartment that can be fitted with a light source module and a camera module. The compartment can be a disposable compartment. In at least one example, the disposable compartment is removably coupled to a reusable controller, a light source module, and a camera module. The light source module and/or the camera module can be selectively chosen depending on the type of surgical procedure. In one aspect, the camera module comprises a CCD sensor. In another aspect, the camera module comprises a CMOS sensor. In another aspect, the camera module is configured for scanned-beam imaging. Similarly, the light source module can be configured to provide white light or a different light, depending on the surgical procedure.
[00175] [00175] During a surgical procedure, removing a surgical device from the surgical field and replacing it with another surgical device that includes a different camera or a different light source can be inefficient. Temporarily losing sight of the surgical field can lead to undesirable consequences. The imaging device module of the present description is configured to allow the replacement of a light source module or a camera module "midstream" during a surgical procedure, without the need to remove the imaging device from the surgical field.
[00176] [00176] In one aspect, the imaging device comprises a tubular compartment that includes a plurality of channels. A first channel is configured to slidably receive the camera module, which can be configured for a snap-fit engagement (pressure fit) with the first channel.
A second channel is configured to slidably receive the light source module, which can be configured for a snap-fit engagement (pressure fit) with the second channel. In another example, the camera module and/or the light source module can be rotated into a final position within their respective channels. A threaded coupling can be used instead of a pressure fit.
[00177] [00177] In several examples, multiple imaging devices are placed at different positions in the surgical field to provide multiple views. The imaging module 138 can be configured to switch between the imaging devices to provide an optimal view. In several aspects, the imaging module 138 can be configured to integrate the images from the different imaging devices.
[00178] [00178] Various image processors and imaging devices suitable for use with the present description are described in US patent No. 7,995,045, entitled COMBINED SBI AND CONVENTIONAL IMAGE PROCESSOR, granted on August 9, 2011, which is incorporated herein by reference in its entirety. In addition, US patent No. 7,982,776, entitled SBI MOTION ARTIFACT REMOVAL APPARATUS AND METHOD, granted on July 19, 2011, which is incorporated herein by reference in its entirety, describes various systems for removing motion artifacts from image data. Such systems can be integrated with the imaging module 138. In addition to these, US patent application publication No. 2011/0306840, entitled CONTROLLABLE
[00179] [00179] Figure 8 illustrates a surgical data network 201 comprising a modular communication central controller 203 configured to connect modular devices located in one or more operating rooms of a healthcare facility, or in any room of a healthcare facility specially equipped for surgical operations, to a cloud-based system (for example, a cloud 204 that may include a remote server 213 coupled to a storage device 205).
In one aspect, the modular communication central controller 203 comprises a central network controller 207 and/or a network switch 209 in communication with a network router. The modular communication central controller 203 can also be coupled to a local computer system 210 to provide local computer processing and data manipulation. The surgical data network 201 can be configured as passive, intelligent, or switching. A passive surgical data network serves as a conduit for the data, allowing the data to pass from one device (or segment) to another and to the cloud computing resources. An intelligent surgical data network includes additional features to allow the traffic that passes through the surgical data network to be monitored and to configure each port on the central network controller 207 or the network switch 209. An intelligent surgical data network can be referred to as a manageable central controller or switch. A switching central controller reads the destination address of each packet and then forwards the packet to the correct port.
[00180] [00180] The modular devices 1a to 1n located in the operating room can be coupled to the modular communication central controller 203. The central network controller 207 and/or the network switch 209 can be coupled to a network router 211 to connect the devices 1a to 1n to the cloud 204 or to the local computer system 210.
[00181] [00181] It will be understood that the surgical data network 201 can be expanded by interconnecting multiple central network controllers 207 and/or multiple network switches 209 with multiple network routers 211. The modular communication central controller 203 can be contained in a modular control tower configured to receive multiple devices 1a to 1n/2a to 2m. The local computer system 210 can also be contained in a modular control tower. The modular communication central controller 203 is connected to a screen 212 to display the images obtained by some of the devices 1a to 1n/2a to 2m, for example, during surgical procedures.
In several aspects, the devices 1a to 1n/2a to 2m can include, for example, various modules such as an imaging module 138 coupled to an endoscope, a generator module 140 coupled to an energy-based surgical device, a smoke evacuation module 126, a suction/irrigation module 128, a communication module 130, a processor module 132, a storage array 134, a surgical device coupled to a screen, and/or a non-contact sensor module, among other modular devices that can be connected to the modular communication central controller 203 of the surgical data network 201.
[00182] [00182] In one aspect, the surgical data network 201 may comprise a combination of central network controllers, network switches, and network routers that connect the devices 1a to 1n/2a to 2m to the cloud. Any or all of the devices 1a to 1n/2a to 2m coupled to the central network controller or to the network switch can collect data in real time and transfer the data to cloud computers for data processing and manipulation. It will be understood that cloud computing depends on the sharing of computing resources rather than having local servers or personal devices to handle software applications. The word "cloud" can be used as a metaphor for "the Internet", although the term is not limited as such. Consequently, the term "cloud computing" can be used herein to refer to "a type of Internet-based computing", in which different services - such as servers, storage, and applications - are delivered to the modular communication central controller 203 and/or the computer system 210 located in the operating room (for example, a fixed, mobile, temporary, or field operating room or space) and to the devices connected to the modular communication central controller 203 and/or the computer system 210, over the Internet. The cloud infrastructure can be maintained by a cloud service provider. In this context, the cloud service provider can be the entity that coordinates the use and control of the devices 1a to 1n/2a to 2m located in one or more operating rooms.
Cloud computing services can perform a large number of calculations based on the data collected by smart surgical instruments, robots, and other computerized devices located in the operating room. The central controller hardware allows multiple devices or connections to be connected to a computer that communicates with the cloud computing and storage resources.
[00183] [00183] By applying cloud computing data processing techniques to the data collected by the devices 1a to 1n/2a to 2m, the surgical data network can provide improved surgical outcomes, reduced costs, and improved patient satisfaction. At least some of the devices 1a to 1n/2a to 2m can be used to view tissue status to assess leakage or perfusion of sealed tissue after a tissue sealing and cutting procedure. At least some of the devices 1a to
[00184] [00184] In one implementation, the operating room devices 1a to 1n can be connected to the modular communication central controller 203 via a wired channel or a wireless channel, depending on the configuration of the devices 1a to 1n with respect to a central network controller. The central network controller 207 can be implemented, in one aspect, as a local network broadcast device that acts on the physical layer of the OSI ("open systems interconnection") model. The central network controller provides connectivity to the devices 1a to 1n located on the same network as the operating room. The central network controller 207 collects the data in the form of packets and sends them to the router in half-duplex mode. The central network controller 207 does not store any media access control/Internet protocol (MAC/IP) addresses for transferring the device data. Only one of the devices 1a to 1n can send data at a time through the central network controller 207. The central network controller 207 has no routing tables or intelligence regarding where to send the information, and transmits all network data across every connection, as well as to a remote server 213 (Figure 9) in the cloud 204.
The central network controller 207 can detect basic network errors, such as collisions, but having all information transmitted to multiple ports can be a security risk and cause bottlenecks.
[00185] [00185] In another implementation, the operating room devices 2a to 2m can be connected to a network switch 209 via a wired channel or a wireless channel. The network switch 209 works in the data link layer of the OSI model. The network switch 209 is a multicast device for connecting the devices 2a to 2m located in the same operating room to the network. The network switch 209 sends data in the form of frames to the network router 211 and works in full-duplex mode. Multiple devices 2a to 2m can send data at the same time through the network switch 209. The network switch 209 stores and uses the MAC addresses of the devices 2a to 2m to transfer data.
[00186] [00186] The central network controller 207 and/or the network switch 209 are coupled to the network router 211 for a connection to the cloud 204.
[00187] [00187] In one example, the central network controller 207 can be implemented as a USB central controller, which allows multiple USB devices to be connected to a host computer. The USB central controller can expand a single USB port into several tiers so that more ports are available to connect devices to the host computer of the system. The central network controller 207 can include wired or wireless capabilities to receive information over a wired channel or a wireless channel. In one aspect, a wireless USB short-range, broadband wireless communication protocol can be used for communication between the devices 1a to 1n and the devices 2a to 2m located in the operating room.
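The contrast drawn in paragraphs [00184] and [00185] - a hub-style central network controller that repeats every packet out of all of its ports in half-duplex, versus a switch that stores MAC addresses and forwards a frame only to the port of its destination - can be illustrated with a toy forwarding model. The device names, addresses, and port numbers below are invented for the example and do not appear in this description.

```python
def hub_forward(in_port, ports):
    """A hub has no MAC table: every frame is repeated out of all
    ports except the one on which it arrived."""
    return [p for p in ports if p != in_port]

class LearningSwitch:
    """A switch learns the source MAC address seen on each port and
    forwards a frame only to the learned port of its destination."""

    def __init__(self, ports):
        self.ports = list(ports)
        self.mac_table = {}  # MAC address -> port

    def forward(self, in_port, src_mac, dst_mac):
        self.mac_table[src_mac] = in_port        # learn the sender's port
        if dst_mac in self.mac_table:
            return [self.mac_table[dst_mac]]     # unicast to the known port
        return [p for p in self.ports if p != in_port]  # unknown: flood

sw = LearningSwitch([1, 2, 3, 4])
print(sw.forward(1, "dev-2a", "dev-2b"))  # destination unknown: floods [2, 3, 4]
print(sw.forward(2, "dev-2b", "dev-2a"))  # destination learned: forwards only [1]
print(hub_forward(1, [1, 2, 3, 4]))       # a hub always repeats: [2, 3, 4]
```

The flooding behavior of the hub is what makes transmitting all data to every connection a potential security risk and bottleneck, as noted above.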
[00188] In other examples, the operating room devices 1a to 1n/2a to 2m can communicate with the modular communication center 203 via standard Bluetooth wireless technology for exchanging data over short distances (using short-wavelength UHF radio waves in the 2.4 to 2.485 GHz ISM band) from fixed and mobile devices and for building personal area networks (PANs, [00189] The modular communication hub 203 can serve as a central connection for one or all of the operating room devices 1a to 1n/2a to 2m and handles a data type known as frames. The frames carry the data generated by the devices 1a to 1n/2a to 2m. When a frame is received by the modular communication center 203, it is amplified and transmitted to the network router 211, which transfers the data to the cloud computing resources using a number of wired or wireless communication standards or protocols, as described in the present invention. [00190] The modular communication center 203 can be used as a standalone device or be connected to compatible central network controllers and network switches to form a larger network. The modular communication center 203 is, in general, easy to install, configure, and maintain, making it a good option for networking the operating room devices 1a to 1n/2a to 2m. [00191] Figure 9 illustrates a computer-implemented interactive surgical system 200. The computer-implemented interactive surgical system 200 is similar in many respects to the computer-implemented interactive surgical system 100. For example, the computer-implemented interactive surgical system 200 includes one or more surgical systems 202, which are similar in many respects to the surgical systems 102.
Each surgical system 202 includes at least one central surgical controller 206 in communication with a cloud 204 that may include a remote server [00192] Figure 10 illustrates a central surgical controller 206 comprising a plurality of modules coupled to the modular control tower 236. The modular control tower 236 comprises a modular communication center 203, for example, a network connectivity device, and a computer system 210 to provide local processing, visualization, and imaging, for example. As shown in Figure 10, the modular communication center 203 can be connected in a layered configuration to expand the number of modules (for example, devices) that can be connected to the modular communication center 203 and transfer data associated with the modules to the computer system 210, the cloud computing resources, or both. As shown in Figure 10, each of the central network controllers/network switches in the modular communication center 203 includes three downstream ports and one upstream port. The upstream central network controller/network switch is connected to a processor to provide a communication connection to the cloud computing resources and a local display 217. Communication with the cloud 204 can be done over a wired or a wireless communication channel. [00193] The central surgical controller 206 uses a non-contact sensor module 242 to measure the dimensions of the operating room and generate a map of the operating room using non-contact measuring devices, such as laser or ultrasonic devices.
An ultrasound-based non-contact sensor module scans the operating room by transmitting a burst of ultrasound and receiving the echo when it bounces off the perimeter walls of the operating room, as described under the heading "Surgical Hub Spatial Awareness Within an Operating Room" in US provisional patent application serial number 62/611,341, entitled INTERACTIVE SURGICAL PLATFORM, filed on December 28, 2017, whose description is hereby incorporated by reference in its entirety, in which the sensor module is configured to determine the size of the operating room and adjust the limits of the Bluetooth pairing distance. A laser-based non-contact sensor module scans the operating room by transmitting pulses of laser light, receiving pulses of laser light that bounce off the perimeter walls of the operating room, and comparing the phase of the transmitted pulse with that of the received pulse to determine the size of the operating room and to adjust the limits of the Bluetooth pairing distance, for example. [00194] Computer system 210 comprises a processor 244 and a network interface 245. Processor 244 is coupled to a communication module 247, storage 248, memory 249, non-volatile memory 250, and an input/output interface 251 through a system bus. The system bus can be any of several types of bus structures, including the memory bus or memory controller, a peripheral bus or external bus, and/or a local bus that uses any variety of available bus architectures including, but not limited to, 9-bit bus, industry standard architecture (ISA), Micro-Channel Architecture (MSA), extended ISA (EISA), intelligent drive electronics (IDE), VESA local bus (VLB), peripheral component interconnect (PCI), USB, accelerated graphics port (AGP), PCMCIA (Personal Computer Memory Card International Association) bus, Small Computer Systems Interface (SCSI), or any other proprietary bus.
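The time-of-flight computation behind the ultrasound variant of the non-contact sensor module 242 can be sketched as follows. The speed of sound, the round-trip time, and the rule for clamping the Bluetooth pairing distance are illustrative assumptions, not values from the disclosure.

```python
# Sketch: distance to a wall from an ultrasound echo delay, then a
# Bluetooth pairing-distance limit clamped to the measured room size.

SPEED_OF_SOUND_M_S = 343.0  # in air at roughly 20 degrees C

def echo_distance_m(round_trip_time_s):
    """One-way distance to a wall: half the round-trip travel of the burst."""
    return SPEED_OF_SOUND_M_S * round_trip_time_s / 2.0

def pairing_limit_m(room_span_m, margin_m=0.5):
    """Keep the Bluetooth pairing range inside the measured room span."""
    return max(0.0, room_span_m - margin_m)

# A 35 ms round trip corresponds to a wall roughly 6 m away.
d = echo_distance_m(0.035)
print(d)                   # about 6.0 m
print(pairing_limit_m(d))  # pairing limit slightly inside the wall
```

The laser variant described above would replace the echo-delay measurement with a phase comparison, but the distance-to-limit step is the same idea.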
[00195] Processor 244 can be any single-core or multi-core processor, such as those known under the trade name ARM Cortex, available from Texas Instruments. In one aspect, the processor may be an LM4F230H5QR ARM Cortex-M4F processor core, available from Texas Instruments, for example, which comprises an integrated 256 KB single-cycle flash memory, or other non-volatile memory, of up to 40 MHz, a prefetch buffer to optimize performance above 40 MHz, a 32 KB single-cycle serial random access memory (SRAM), an internal read-only memory (ROM) loaded with the StellarisWare® program, 2 KB of electrically erasable programmable read-only memory (EEPROM), one or more pulse width modulation (PWM) modules, one or more quadrature encoder input (QEI) analogs, one or more 12-bit analog-to-digital converters (ADC) with 12 analog input channels, details of which are available in the product data sheet. [00196] In one aspect, processor 244 may comprise a safety controller comprising two controller-based families, such as TMS570 and RM4x, known under the trade name Hercules ARM Cortex R4, also from Texas Instruments. The safety controller can be configured specifically for IEC 61508 and ISO 26262 safety-critical applications, among others, to provide advanced integrated safety features while providing scalable performance, connectivity, and memory options. [00197] System memory includes volatile and non-volatile memory. The basic input/output system (BIOS), containing the basic routines for transferring information between elements within the computer system, such as during startup, is stored in the non-volatile memory. For example, the non-volatile memory can include ROM, programmable ROM (PROM), electrically programmable ROM (EPROM), EEPROM, or flash memory. Volatile memory includes random access memory (RAM), which acts as external cache memory.
In addition, RAM is available in many forms, such as SRAM, dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), and direct Rambus RAM (DRRAM). [00198] Computer system 210 also includes removable/non-removable, volatile/non-volatile computer storage media, such as disk storage. Disk storage includes, but is not limited to, devices such as a magnetic disk drive, floppy disk drive, tape drive, Jaz drive, Zip drive, LS-60 drive, flash memory card, or memory stick (pen drive). In addition, the disk storage may include storage media separately or in combination with other storage media including, but not limited to, an optical disc drive such as a compact disc ROM device (CD-ROM), recordable compact disc drive (CD-R Drive), rewritable compact disc drive (CD-RW Drive), or a digital versatile disc ROM drive (DVD-ROM). To facilitate the connection of the disk storage devices to the system bus, a removable or non-removable interface can be used. [00199] It is to be understood that computer system 210 includes software that acts as an intermediary between users and the basic computer resources described in a suitable operating environment. Such software includes an operating system. The operating system, which can be stored on the disk storage, acts to control and allocate resources of the computer system. System applications take advantage of the management of resources by the operating system through program modules and program data stored either in the system memory or on the disk storage. It is to be understood that the various components described in the present invention can be implemented with various operating systems or combinations of operating systems. [00200] A user enters commands or information into the computer system 210 through input device(s) coupled to the I/O interface 251.
The input devices include, but are not limited to, a pointing device such as a mouse, trackball, stylus, touchpad, keyboard, microphone, joystick, game pad, satellite dish, scanner, TV tuner card, digital camera, digital video camera, web camera, and the like. These and other input devices connect to the processor through the system bus via the interface port(s). The interface ports include, for example, a serial port, a parallel port, a game port, and a USB. The output devices use some of the same types of ports as the input devices. In this way, for example, a USB port can be used to provide input to the computer system and to provide information from the computer system to an output device. An output adapter is provided to illustrate that there are some output devices, such as monitors, screens, speakers, and printers, among other output devices, that need special adapters. The output adapters include, by way of illustration and not limitation, video and sound cards that provide a means of connection between the output device and the system bus. It should be noted that other devices and/or systems of devices, such as remote computers, provide both input and output capabilities. [00201] Computer system 210 can operate in a networked environment using logical connections with one or more remote computers, such as cloud computers, or local computers. The remote cloud computers can be a personal computer, server, router, personal network computer, workstation, microprocessor-based device, peer device, or other common network node, and the like, and typically include many or all of the elements described in relation to the computer system. For the sake of brevity, only one memory storage device is illustrated with the remote computers. The remote computers are logically connected to the computer system through a network interface and then physically connected via a communication connection.
The network interface covers communication networks, such as local area networks (LANs) and wide area networks (WANs). LAN technologies include fiber distributed data interface (FDDI), copper distributed data interface (CDDI), Ethernet/IEEE 802.3, Token Ring/IEEE 802.5, and the like. WAN technologies include, but are not limited to, point-to-point links, circuit switching networks such as integrated services digital networks (ISDN) and variations thereon, packet switching networks, and digital subscriber lines (DSL). [00202] In several aspects, the computer system 210 of Figure 10, the imaging module 238 and/or display system 208, and/or the processor module 232 of Figures 9 to 10 may comprise an image processor, image processing engine, media processor, or any specialized digital signal processor (DSP) used for processing digital images. [00203] Communication connections refer to the hardware/software used to connect the network interface to the bus. Although the communication connection is shown for illustrative clarity within the computer system, it can also be external to the computer system 210. The hardware/software required for connection to the network interface includes, for illustrative purposes only, internal and external technologies such as modems, including regular telephone-grade modems, cable modems, and DSL modems, ISDN adapters, and Ethernet cards. [00204] Figure 11 illustrates a functional block diagram of one aspect of a USB central network controller device 300, in accordance with one aspect of the present description. In the illustrated aspect, the USB central network controller device 300 uses a TUSB2036 integrated circuit central controller, available from Texas Instruments. The USB central network controller 300 is a CMOS device that provides one upstream USB transceiver port 302 and up to three downstream USB transceiver ports 304, 306, 308, in accordance with the USB 2.0 specification.
The upstream USB transceiver port 302 is a differential root data port comprising a differential data minus (DM0) input paired with a differential data plus (DP0) input. The three downstream USB transceiver ports 304, 306, 308 are differential data ports, with each port including differential data plus (DP1-DP3) outputs paired with differential data minus (DM1-DM3) outputs. [00205] The USB central network controller device 300 is implemented with a digital state machine instead of a microcontroller, and no firmware programming is required. Fully compliant USB transceivers are integrated into the circuit for the upstream USB transceiver port 302 and all downstream USB transceiver ports 304, 306, 308. The downstream USB transceiver ports 304, 306, 308 support both full-speed and low-speed devices by automatically setting the slew rate according to the speed of the device attached to the ports. The USB central network controller device 300 can be configured in bus-powered or self-powered mode and includes central controller power logic 312 to manage power. [00206] The USB central network controller device 300 includes a serial interface engine (SIE) 310. The SIE 310 is the front end of the USB central network controller 300 hardware and handles most of the protocol described in chapter 8 of the USB specification. The SIE 310 typically comprises signaling down to the transaction level. The functions it handles could include: packet recognition, transaction sequencing, SOP, EOP, RESET, and RESUME signal detection/generation, clock/data separation, non-return-to-zero inverted (NRZI) data encoding/decoding, CRC generation and checking (token and data), packet ID (PID) generation and checking/decoding, and/or serial-parallel/parallel-serial conversion.
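One of the SIE functions named in [00206], token CRC generation, can be illustrated with a short sketch. USB token packets carry a 5-bit CRC over the 11 address/endpoint bits, with polynomial x^5 + x^2 + 1; the bit ordering, seed, and final inversion below follow the common USB convention, but this is an illustrative sketch rather than the TUSB2036's actual logic.

```python
# Illustrative CRC-5 over a USB token field (11 bits, LSB-first).

def crc5_usb(data_bits):
    """CRC-5 of a bit list, polynomial x^5 + x^2 + 1 (0b00101)."""
    crc = 0b11111                    # shift register seeded with all ones
    for bit in data_bits:
        feedback = bit ^ ((crc >> 4) & 1)
        crc = (crc << 1) & 0b11111
        if feedback:
            crc ^= 0b00101           # apply the polynomial taps
    return crc ^ 0b11111             # invert before transmission

# An 11-bit token field (7-bit address + 4-bit endpoint), all zeros here.
print(crc5_usb([0] * 11))            # 8
```

The SIE performs the same check in hardware on every received token, flagging packets whose CRC does not verify.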
The SIE 310 receives a clock input 314 and is coupled to a suspend/resume logic and frame timer circuit 316 and a central controller repeater circuit 318 to control communication between the upstream USB transceiver port 302 and the downstream USB transceiver ports 304, 306, 308 through port logic circuits 320, 322, 324. The SIE 310 is coupled to a command decoder 326 via interface logic to control commands from a serial EEPROM via a serial EEPROM interface 330. [00207] In several aspects, the USB central network controller 300 can connect 127 functions configured in up to six logical layers (levels) to a single computer. In addition, the USB central network controller 300 can connect all peripherals using a standardized four-wire cable that provides both communication and power distribution. The power configurations are bus-powered and self-powered modes. The USB central network controller 300 can be configured to support four power management modes: a bus-powered central controller with individual-port power management or grouped-port power management, and a self-powered central controller with individual-port power management or grouped-port power management. In one aspect, using a USB cable, the upstream USB transceiver port 302 of the USB central network controller 300 is plugged into a USB host controller, and the downstream USB transceiver ports 304, 306, 308 are exposed for connecting compatible USB devices, and so on. Surgical instrument hardware [00208] Figure 12 illustrates a logic diagram of a control system 470 of a surgical instrument or tool, according to one or more aspects of the present description. The system 470 comprises a control circuit. The control circuit includes a microcontroller 461 comprising a processor 462 and a memory 468. One or more of the sensors 472, 474, 476, for example, provide real-time feedback to the processor 462.
A motor 482, driven by a motor driver 492, operably couples a longitudinally movable displacement member to drive the I-beam knife element. A tracking system 480 is configured to determine the position of the longitudinally movable displacement member. The position information is provided to the processor 462, which can be programmed or configured to determine the position of the longitudinally movable drive member, as well as the position of a firing member, a firing bar, and the I-beam knife element. Additional motors can be provided at the instrument driver interface to control I-beam firing, closure tube travel, drive shaft rotation, and articulation. A display 473 displays a variety of instrument operating conditions and can include touchscreen functionality for data entry. The information displayed on the display 473 can be overlaid with images captured using endoscopic imaging modules. [00209] In one aspect, the microcontroller 461 can be any single-core or multi-core processor, such as those known under the trade name ARM Cortex, available from Texas Instruments. In one aspect, the main microcontroller 461 can be an LM4F230H5QR ARM Cortex-M4F processor core, available from Texas Instruments, for example, which comprises an integrated 256 KB single-cycle flash memory, or other non-volatile memory, of up to 40 MHz, a prefetch buffer to optimize performance above 40 MHz, a 32 KB single-cycle serial random access memory (SRAM), an internal read-only memory (ROM) loaded with the StellarisWare® program, 2 KB of electrically erasable programmable read-only memory (EEPROM), one or more pulse width modulation (PWM) modules, one or more quadrature encoder input (QEI) analogs, and/or one or more 12-bit analog-to-digital converters (ADC) with 12 analog input channels, details of which are available in the product data sheet.
[00210] In one aspect, the microcontroller 461 may comprise a safety controller comprising two controller-based families, such as TMS570 and RM4x, known under the trade name Hercules ARM Cortex R4, also available from Texas Instruments. The safety controller can be configured specifically for IEC 61508 and ISO 26262 safety-critical applications, among others, to provide advanced integrated safety features while providing scalable performance, connectivity, and memory options. [00211] The microcontroller 461 can be programmed to perform various functions, such as precise control of the speed and position of the knife and articulation systems. In one aspect, the microcontroller 461 includes a processor 462 and a memory 468. The electric motor 482 can be a brushed direct current (DC) motor with a gearbox and mechanical links to an articulation or knife system. In one aspect, a motor driver 492 can be an A3941, available from Allegro Microsystems, Inc. Other motor drivers can be readily substituted for use in the tracking system 480, which comprises an absolute positioning system. A detailed description of an absolute positioning system is provided in US Patent Application Publication No. 2017/0296213, entitled [00212] The microcontroller 461 can be programmed to provide precise control of the speed and position of the displacement members and articulation systems. The microcontroller 461 can be configured to compute a response in the software of the microcontroller 461. The computed response is compared with a measured response of the real system to obtain an "observed" response, which is used for actual feedback-based decisions. The observed response is a favorable, adjusted value that balances the uniform and continuous nature of the simulated response with the measured response, which can detect external influences on the system.
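The "observed response" of [00212] blends a smooth simulated response with a noisy measured one. A minimal sketch of that idea is a weighted blend; the function name, the weight alpha, and the sample values are assumptions for illustration, not the patent's actual computation.

```python
# Sketch: blend a smooth simulated response with a noisy measured response.

def observed_response(simulated, measured, alpha=0.7):
    """Weighted blend; alpha favors the smooth simulated value."""
    return [alpha * s + (1.0 - alpha) * m for s, m in zip(simulated, measured)]

sim = [0.0, 1.0, 2.0, 3.0]    # smooth, continuous model output
meas = [0.0, 1.4, 1.8, 3.2]   # measured samples with external disturbances
obs = observed_response(sim, meas)
print(obs)                    # stays near the model but tracks disturbances
```

A smaller alpha would weight the measurement more heavily, letting external influences on the system show through faster at the cost of smoothness.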
[00213] In one aspect, the motor 482 can be controlled by the motor driver 492 and can be used by the firing system of the surgical instrument or tool. In several aspects, the motor 482 can be a brushed direct current (DC) drive motor with a maximum speed of approximately 25,000 RPM, for example. In other arrangements, the motor 482 may include a brushless motor, a cordless motor, a synchronous motor, a stepper motor, or any other suitable type of electric motor. The motor driver 492 may comprise an H-bridge driver comprising field-effect transistors (FETs), for example. The motor 482 can be powered by a power assembly releasably mounted on the handle assembly or tool housing to supply control power to the surgical instrument or tool. The power assembly may comprise a battery that may include several battery cells connected in series, which can be used as the power source to power the surgical instrument or tool. In certain circumstances, the battery cells of the power assembly may be replaceable and/or rechargeable. In at least one example, the battery cells can be lithium-ion batteries, which can be couplable to and separable from the power assembly. [00214] The motor driver 492 can be an A3941, available from Allegro Microsystems, Inc. The A3941 driver 492 is a full-bridge controller for use with external N-channel power metal-oxide-semiconductor field-effect transistors (MOSFETs), specifically designed for inductive loads such as brushed DC motors. The driver 492 comprises a unique charge pump regulator that provides full (> 10 V) gate drive for battery voltages down to 7 V and allows the A3941 to operate with a reduced gate drive, down to 5.5 V. A bootstrap capacitor can be used to provide the above-battery supply voltage required for the N-channel MOSFETs. An internal charge pump for the high-side drive allows direct current (100% duty cycle) operation.
The full bridge can be driven in fast or slow decay modes using diode or synchronous rectification. In slow decay mode, the current can be recirculated through either the high-side or the low-side FETs. The power FETs are protected from shoot-through by resistor-programmable dead time. Integrated diagnostics provide indication of undervoltage, overtemperature, and power bridge faults, and can be configured to protect the power MOSFETs under most short-circuit conditions. Other motor drivers can be readily substituted for use in the tracking system 480 comprising an absolute positioning system. [00215] The tracking system 480 comprises a controlled motor drive circuit arrangement comprising a position sensor 472, in accordance with one aspect of the present description. The position sensor 472 of an absolute positioning system provides a unique position signal that corresponds to the location of a displacement member. [00216] The electric motor 482 may include a rotary drive shaft that operably interfaces with a gear assembly that is mounted in meshing engagement with a set, or rack, of drive teeth on the displacement member. A sensor element can be operably coupled to a gear assembly such that a single revolution of the position sensor 472 element corresponds to some linear longitudinal translation of the displacement member. An arrangement of gears and sensors can be connected to the linear actuator by means of a rack-and-pinion arrangement, or to a rotary actuator by means of a spur gear or other connection. A power source supplies power to the absolute positioning system, and an output indicator can display the output of the absolute positioning system. The displacement member represents the longitudinally movable drive member comprising a rack of drive teeth formed on it for meshing engagement with a corresponding drive gear of the gear reducer assembly.
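The drive and decay modes of the full bridge described in [00214] can be pictured as a small truth table of which FETs conduct. The leg names and the specific states below are a conceptual sketch of fast versus slow decay for a forward-driving bridge, not the A3941's actual control logic.

```python
# Conceptual H-bridge states: high/low-side FETs on legs A and B.

def bridge_state(mode):
    """Return the set of conducting FETs for a forward-driving bridge."""
    states = {
        # forward drive: current flows high-side A -> motor -> low-side B
        "drive":      {"AH", "BL"},
        # slow decay: recirculate the motor current through both low sides
        "slow_decay": {"AL", "BL"},
        # fast decay: all FETs off; current returns through the body diodes
        "fast_decay": set(),
    }
    return states[mode]

print(sorted(bridge_state("drive")))       # ['AH', 'BL']
print(sorted(bridge_state("slow_decay")))  # ['AL', 'BL']
```

Dead time sits between any two of these states so that a high-side and low-side FET on the same leg are never on together, which is the shoot-through condition the resistor-programmable dead time guards against.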
The displacement member represents the longitudinally movable firing member, the firing bar, the I-beam, or combinations thereof. [00217] A single revolution of the sensor element associated with the position sensor 472 is equivalent to a longitudinal linear displacement d1 of the displacement member, where d1 is the longitudinal linear distance that the displacement member travels from point "a" to point "b" after a single revolution of the sensor element coupled to the displacement member. The sensor arrangement can be connected by means of a gear reduction that results in the position sensor 472 completing one or more revolutions for the complete travel of the displacement member. The position sensor 472 can complete multiple revolutions for the complete travel of the displacement member. [00218] A series of switches, where n is an integer greater than one, can be used alone or in combination with a gear reduction to provide a unique position signal for more than one revolution of the position sensor 472. The state of the switches is fed back to the microcontroller 461, which applies logic to determine a unique position signal corresponding to the longitudinal linear displacement d1 + d2 + ... dn of the displacement member. The output of the position sensor 472 is supplied to the microcontroller 461. In several embodiments, the position sensor 472 of the sensor arrangement may comprise a magnetic sensor, an analog rotary sensor, such as a potentiometer, or an array of analog Hall-effect elements, which emit a unique combination of position signals or values. [00219] The position sensor 472 can comprise any number of magnetic detection elements, such as, for example, magnetic sensors classified according to whether they measure the total magnetic field or the vector components of the magnetic field. The techniques used to produce both types of magnetic sensors cover many aspects of physics and electronics.
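The arithmetic of [00217] and [00218], where each sensor revolution corresponds to a linear travel d1 and the revolution count disambiguates multiple turns, can be sketched as follows. The travel-per-revolution value is an assumed figure for illustration only.

```python
# Sketch: absolute linear position from revolutions plus a sensor angle.

D1_MM_PER_REV = 10.0   # longitudinal travel per sensor revolution (assumed)

def linear_position_mm(revolutions, angle_deg):
    """Absolute position from whole revolutions plus a 0-360 degree angle."""
    return D1_MM_PER_REV * (revolutions + angle_deg / 360.0)

print(linear_position_mm(0, 180.0))  # 5.0 mm: half of the first revolution
print(linear_position_mm(2, 90.0))   # 22.5 mm: d1 + d2 + part of the third
```

The revolution count plays the role of the switch states fed back to the microcontroller 461: without it, an angle alone cannot distinguish between d1, d1 + d2, and so on.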
The technologies used for magnetic field detection include search coil, fluxgate, optically pumped, nuclear precession, SQUID, Hall effect, anisotropic magnetoresistance, giant magnetoresistance, magnetic tunnel junctions, giant magnetoimpedance, magnetostrictive/piezoelectric composites, magnetodiode, magnetotransistor, fiber optic, magneto-optic, and microelectromechanical-systems-based magnetic sensors, among others. [00220] In one aspect, the position sensor 472 of the tracking system 480, which comprises an absolute positioning system, comprises a magnetic rotary absolute positioning system. The position sensor 472 can be implemented as an AS5055EQFT single-chip magnetic rotary position sensor, available from Austria Microsystems, AG. The position sensor 472 interfaces with the microcontroller 461 to provide an absolute positioning system. The position sensor 472 is a low-voltage, low-power component and includes four Hall-effect elements in an area of the position sensor 472 located above a magnet. A high-resolution ADC and a smart power management controller are also provided on the integrated circuit. A CORDIC (coordinate rotation digital computer) processor, also known as the digit-by-digit method or Volder's algorithm, is provided to implement a simple and efficient algorithm for calculating hyperbolic and trigonometric functions that requires only addition, subtraction, bit-shift, and table-lookup operations. The angle position, alarm bits, and magnetic field information are transmitted via a standard serial communication interface, such as a serial peripheral interface (SPI), to the microcontroller 461. The position sensor 472 provides 12 or 14 bits of resolution. The position sensor 472 can be an AS5055 integrated circuit supplied in a small 16-pin QFN package measuring 4 x 4 x 0.85 mm.
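The CORDIC algorithm named in [00220] can be sketched compactly in rotation mode: it computes sine and cosine using only additions, subtractions, shifts, and a small arctangent lookup table. Floating-point arithmetic stands in here for the fixed-point shifts a hardware implementation would use; this is an illustrative sketch, not the sensor's actual processor.

```python
# CORDIC rotation-mode sketch: cos/sin via shift-and-add iterations.
import math

ANGLES = [math.atan(2.0 ** -i) for i in range(16)]   # arctangent lookup table
K = 1.0
for a in ANGLES:
    K *= math.cos(a)                                 # CORDIC gain correction

def cordic(theta):
    """Return (cos(theta), sin(theta)) for |theta| < pi/2."""
    x, y, z = 1.0, 0.0, theta
    for i, a in enumerate(ANGLES):
        d = 1.0 if z >= 0 else -1.0                  # rotate toward z == 0
        x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
        z -= d * a
    return x * K, y * K

c, s = cordic(math.pi / 6)
print(c, s)   # approximately cos(30 deg) = 0.8660 and sin(30 deg) = 0.5
```

Sixteen iterations give roughly 2^-16 angular precision, which is why the technique suits a 12- or 14-bit angle output.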
[00221] The tracking system 480, comprising an absolute positioning system, can comprise and/or be programmed to implement a feedback controller, such as a PID, state feedback, and adaptive controller. A power source converts the signal from the feedback controller into a physical input to the system: in this case, the voltage. Other examples include a PWM of the voltage, current, and force. Other sensors can be provided to measure parameters of the physical system in addition to the position measured by the position sensor 472. In some aspects, the other sensors may include sensor arrangements as described in US patent No. 9,345,481, entitled STAPLE CARTRIDGE TISSUE THICKNESS SENSOR SYSTEM, granted on May 24, 2016, which is incorporated herein by reference in its entirety; US patent application publication No. 2014/0263552, entitled STAPLE CARTRIDGE TISSUE THICKNESS SENSOR SYSTEM, published on September 18, 2014, which is incorporated herein by reference in its entirety; and US patent application serial number 15/628,175, entitled TECHNIQUES FOR ADAPTIVE CONTROL OF [00222] The absolute positioning system provides an absolute position of the displacement member upon activation of the instrument, without the need to retract or advance the longitudinally movable drive member to a reset (zero or home) position, as may be required by conventional rotary encoders, which merely count the number of forward or backward steps that the motor 482 has taken to infer the position of a device actuator, drive bar, knife, and the like. [00223] A sensor 474, such as a strain gauge or a micro-strain gauge, is configured to measure one or more parameters of the end actuator, such as, for example, the amplitude of the strain exerted on the anvil during a clamping operation, which can be indicative of tissue compression. The measured strain is converted into a digital signal and fed to the processor 462.
Alternatively, or in addition to the sensor 474, a sensor 476, such as a load sensor, can measure the closing force applied to the anvil by the closure drive system. The sensor 476, such as a load sensor, can measure the firing force applied to an I-beam in a firing stroke of the surgical instrument or tool. The I-beam is configured to engage a wedge sled, which is configured to cam the staple drivers upward to push the staples out into deforming contact with an anvil. The I-beam also includes a sharp cutting edge that can be used to sever tissue as the I-beam is advanced distally by the firing bar. Alternatively, a current sensor 478 can be used to measure the current drawn by the motor 482. The force required to advance the firing member can correspond to the current drawn by the motor 482, for example. The measured force is converted into a digital signal and supplied to the processor 462. [00224] In one form, a strain gauge sensor 474 can be used to measure the force applied to the tissue by the end actuator. A strain gauge can be coupled to the end actuator to measure the force applied to the tissue being treated by the end actuator. A system for measuring the forces applied to the tissue grasped by the end actuator comprises a strain gauge sensor 474, such as, for example, a micro-strain gauge, which is configured to measure one or more parameters of the end actuator, for example. In one aspect, the strain gauge sensor 474 can measure the amplitude or magnitude of the mechanical strain exerted on a jaw member of an end actuator during a clamping operation, which can be indicative of tissue compression.
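The PID option named in [00221] for the feedback controller of the tracking system 480 can be sketched as a minimal discrete loop driving a toy plant toward a setpoint. The gains, time step, plant model, and setpoint below are illustrative assumptions, not values from the disclosure.

```python
# Minimal discrete PID loop driving a toy first-order plant to a setpoint.

class PID:
    """Discrete PID controller: proportional, integral, derivative terms."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measured):
        error = setpoint - measured
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

pid = PID(kp=0.8, ki=0.2, kd=0.05, dt=0.01)
position = 0.0
for _ in range(2000):                # 20 s of simulated time
    v = pid.update(10.0, position)   # controller output acts as a velocity
    position += v * pid.dt           # toy plant: integrate the velocity
print(position)                      # settles close to the 10.0 setpoint
```

In the tracking system, the measured input would come from the position sensor 472 and the output would be converted by the power source into a voltage, current, or PWM drive for the motor 482.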
The measured strain is converted into a digital signal and fed to the processor 462 of a microcontroller [00225] Measurements of tissue compression, tissue thickness, and/or the force required to close the end actuator on the tissue, as measured respectively by the sensors 474, 476, can be used by the microcontroller 461 to characterize the selected position of the firing member and/or the corresponding value of the speed of the firing member. In one case, a memory 468 can store a technique, an equation, and/or a lookup table that can be used by the microcontroller 461 in the evaluation. [00226] The control system 470 of the surgical instrument or tool can also comprise wired or wireless communication circuits for communication with the modular communication center shown in Figures 8 to 11. [00227] Figure 13 illustrates a control circuit 500 configured to control aspects of the surgical instrument or tool according to one aspect of the present description. The control circuit 500 can be configured to implement various processes described herein. The control circuit 500 may comprise a microcontroller comprising one or more processors 502 (for example, a microprocessor, a microcontroller) coupled to at least one memory circuit 504. The memory circuit 504 stores machine-executable instructions that, when executed by the processor 502, cause the processor 502 to execute machine instructions to implement several of the processes described herein. The processor 502 can be any one of a number of single-core or multi-core processors known in the art. The memory circuit 504 may comprise volatile and non-volatile storage media. The processor 502 can include an instruction processing unit 506 and an arithmetic unit 508. The instruction processing unit can be configured to receive instructions from the memory circuit 504 of the present description.
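The lookup-table evaluation of [00225], mapping a measured tissue parameter to a firing-member speed, can be sketched as a banded table. The thresholds and speeds below are invented for illustration; the actual relationship would come from the stored technique, equation, or lookup table in the memory 468.

```python
# Hypothetical lookup table: tissue compression -> firing-member speed.

FIRING_SPEED_TABLE = [
    # (upper compression bound, firing speed in mm/s): stiffer -> slower
    (0.2, 15.0),
    (0.5, 10.0),
    (0.8, 5.0),
    (1.0, 2.5),
]

def firing_speed(compression):
    """Pick the speed for the first band the compression value falls in."""
    for limit, speed in FIRING_SPEED_TABLE:
        if compression <= limit:
            return speed
    return FIRING_SPEED_TABLE[-1][1]   # clamp above the table range

print(firing_speed(0.1))   # 15.0
print(firing_speed(0.65))  # 5.0
```

A banded table of this kind keeps the evaluation cheap enough to run on every control cycle of the microcontroller 461.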
[00228] Figure 14 illustrates a combinational logic circuit 510 configured to control aspects of the surgical instrument or tool according to one aspect of this disclosure. The combinational logic circuit 510 can be configured to implement various processes described herein. The combinational logic circuit 510 may comprise a finite state machine comprising a combinational logic 512 configured to receive data associated with the surgical instrument or tool at an input 514, process the data by the combinational logic 512, and provide an output 516.
[00229] Figure 15 illustrates a sequential logic circuit 520 configured to control aspects of the surgical instrument or tool according to one aspect of this disclosure. The sequential logic circuit 520 or the combinational logic 522 can be configured to implement various processes described herein. The sequential logic circuit 520 may comprise a finite state machine. The sequential logic circuit 520 may comprise a combinational logic 522, at least one memory circuit 524, and a clock 529, for example. The at least one memory circuit 524 can store a current state of the finite state machine. In certain instances, the sequential logic circuit 520 may be synchronous or asynchronous. The combinational logic 522 is configured to receive data associated with the surgical instrument or tool from an input 526, process the data by the combinational logic 522, and provide an output 528. In other aspects, the circuit may comprise a combination of a processor (e.g., processor 502, Figure 13) and a finite state machine to implement various processes herein. In other aspects, the finite state machine may comprise a combination of a combinational logic circuit (e.g., combinational logic circuit 510, Figure 14) and the sequential logic circuit 520.
[00230] Figure 16 illustrates a surgical instrument or tool comprising a plurality of motors that can be activated to perform various functions.
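The finite state machine formed by the combinational logic and the state-holding memory circuit described above can be illustrated with a small sketch: a transition table (the combinational part) plus a latched current state (the memory part). The states, events, and outputs are invented for illustration, not taken from this disclosure.

```python
# Minimal finite state machine in the spirit of the sequential logic circuit
# 520: combinational logic computes (next_state, output) from (state, input),
# and a memory element holds the current state between clock cycles.
# All state, event, and output names are invented for this sketch.
TRANSITIONS = {
    ("idle",    "clamp"):   ("clamped", "anvil_closed"),
    ("clamped", "fire"):    ("firing",  "knife_advancing"),
    ("firing",  "done"):    ("clamped", "knife_retracted"),
    ("clamped", "release"): ("idle",    "anvil_open"),
}

class SequentialLogic:
    def __init__(self):
        self.state = "idle"  # the memory circuit's stored state

    def clock(self, event):
        """One clock cycle: combinational lookup, then latch the next state."""
        self.state, output = TRANSITIONS[(self.state, event)]
        return output
```

A purely combinational circuit (Figure 14) corresponds to the table lookup alone, with no stored state.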
In certain instances, a first motor can be activated to perform a first function, a second motor can be activated to perform a second function, a third motor can be activated to perform a third function, a fourth motor can be activated to perform a fourth function, and so on. In certain instances, the plurality of motors of the robotic surgical instrument 600 can be individually activated to cause firing, closure, and/or articulation motions in the end effector. The firing, closure, and/or articulation motions can be transmitted to the end effector through a shaft assembly, for example.
[00231] In certain instances, the surgical instrument system or tool may include a firing motor 602. The firing motor 602 may be operably coupled to a firing motor drive assembly 604, which can be configured to transmit firing motions, generated by the motor 602, to the end effector, in particular to displace the I-beam element. In certain instances, the firing motions generated by the firing motor 602 may cause the staples to be deployed from the staple cartridge into tissue captured by the end effector and/or the cutting edge of the I-beam element to be advanced to cut the captured tissue, for example. The I-beam element may be retracted by reversing the direction of the motor 602.
[00232] In certain instances, the surgical instrument or tool may include a closure motor 603. The closure motor 603 may be operably coupled to a closure motor drive assembly 605 which can be configured to transmit closure motions, generated by the motor 603, to the end effector, in particular to displace a closure tube to close the anvil and compress tissue between the anvil and the staple cartridge. The closure motions may cause the end effector to transition from an open configuration to an approximated configuration to capture tissue, for example.
The end effector may be transitioned to an open position by reversing the direction of the motor 603.
[00233] In certain instances, the surgical instrument or tool may include one or more articulation motors 606a, 606b, for example. The motors 606a, 606b may be operably coupled to respective articulation motor drive assemblies 608a, 608b, which can be configured to transmit articulation motions generated by the motors 606a, 606b to the end effector. In certain instances, the articulation motions may cause the end effector to articulate relative to the shaft, for example.
[00234] As described above, the surgical instrument or tool may include a plurality of motors which may be configured to perform various independent functions. In certain instances, the plurality of motors of the surgical instrument or tool can be individually or separately activated to perform one or more functions while the other motors remain inactive. For example, the articulation motors 606a, 606b can be activated to cause the end effector to be articulated while the firing motor 602 remains inactive. Alternatively, the firing motor 602 can be activated to fire the plurality of staples and/or advance the cutting edge while the articulation motor 606 remains inactive. Furthermore, the closure motor 603 may be activated simultaneously with the firing motor 602 to cause the closure tube and the I-beam element to advance distally, as described in more detail hereinbelow.
[00235] In certain instances, the surgical instrument or tool may include a common control module 610 which can be employed with a plurality of motors of the surgical instrument or tool. In certain instances, the common control module 610 may accommodate one of the plurality of motors at a time. For example, the common control module 610 can be couplable to and separable from the plurality of motors of the robotic surgical instrument individually.
In certain instances, a plurality of the motors of the surgical instrument or tool may share one or more common control modules such as the common control module 610. In certain instances, a plurality of motors of the surgical instrument or tool can be individually and selectively engaged with the common control module 610. In certain instances, the common control module 610 can be selectively switched from interfacing with one of a plurality of motors of the surgical instrument or tool to interfacing with another of the plurality of motors of the surgical instrument or tool.
[00236] In at least one example, the common control module 610 can be selectively switched between operable engagement with the articulation motors 606a, 606b and operable engagement with either the firing motor 602 or the closure motor 603. In at least one example, as illustrated in Figure 16, a switch 614 can be moved or transitioned between a plurality of positions and/or states. In a first position 616, the switch 614 may electrically couple the common control module 610 to the firing motor 602; in a second position 617, the switch 614 may electrically couple the common control module 610 to the closure motor 603; in a third position 618a, the switch 614 may electrically couple the common control module 610 to the first articulation motor 606a; and in a fourth position 618b, the switch 614 may electrically couple the common control module 610 to the second articulation motor 606b, for example. In certain instances, separate common control modules 610 can be electrically coupled to the firing motor 602, the closure motor 603, and the articulation motors 606a, 606b at the same time. In certain instances, the switch 614 may be a mechanical switch, an electromechanical switch, a solid-state switch, or any suitable switching mechanism.
[00237] Each of the motors 602, 603, 606a, 606b may comprise a torque sensor to measure the output torque on the shaft of the motor.
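A software analogue of the common control module 610 and switch 614 described above, in which one driver is coupled to exactly one motor at a time according to the switch position, might look like the following sketch. The position labels mirror the text; the class and its behavior are invented for illustration.

```python
# Sketch of the shared control module idea: one driver is coupled, via a
# switch position, to exactly one of several motors at a time.
# Position names mirror the text (616, 617, 618a, 618b); behavior is invented.
MOTOR_BY_POSITION = {
    "616":  "firing_motor_602",
    "617":  "closing_motor_603",
    "618a": "articulation_motor_606a",
    "618b": "articulation_motor_606b",
}

class CommonControlModule:
    def __init__(self):
        self.coupled_motor = None  # no motor coupled until the switch is set

    def set_switch(self, position):
        """Model moving the switch to couple the module to one motor."""
        self.coupled_motor = MOTOR_BY_POSITION[position]

    def drive(self, duty):
        """Deliver a drive command to whichever motor is currently coupled."""
        if self.coupled_motor is None:
            raise RuntimeError("no motor coupled")
        return (self.coupled_motor, duty)
```

The point of the pattern is that the drive logic is written once and reused across motors, exactly as one physical driver circuit serves several motors through the switch.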
The force on an end effector may be sensed in any conventional manner, such as by force sensors on the outer sides of the jaws or by a torque sensor for the motor actuating the jaws.
[00238] In various instances, as illustrated in Figure 16, the common control module 610 may comprise a motor driver 626 which may comprise one or more H-bridge FETs. The motor driver 626 may modulate the power transmitted from a power source 628 to a motor coupled to the common control module 610 based on input from a microcontroller 620 (the "controller"), for example. In certain instances, the microcontroller 620 can be employed to determine the current drawn by the motor, for example, while the motor is coupled to the common control module 610, as described above.
[00239] In certain instances, the microcontroller 620 may include a microprocessor 622 (the "processor") and one or more non-transitory computer-readable mediums or memory units 624 (the "memory"). In certain instances, the memory 624 may store various program instructions, which when executed may cause the processor 622 to perform a plurality of functions and/or calculations described herein. In certain instances, one or more of the memory units 624 may be coupled to the processor 622, for example.
[00240] In certain instances, the power source 628 can be employed to supply power to the microcontroller 620, for example.
[00241] In various instances, the processor 622 may control the motor driver 626 to control the position, direction of rotation, and/or velocity of a motor that is coupled to the common control module 610. In certain instances, the processor 622 can signal the motor driver 626 to stop and/or disable a motor that is coupled to the common control module 610. It should be understood that the term "processor" as used herein includes any suitable microprocessor, microcontroller, or other basic computing device that incorporates the functions of a computer's central processing unit (CPU) on an integrated circuit or, at most, a few integrated circuits. The processor is a multipurpose, programmable device that accepts digital data as input, processes it according to instructions stored in its memory, and provides results as output. It is an example of sequential digital logic, as it has internal memory. Processors operate on numbers and symbols represented in the binary numeral system.
[00242] In one example, the processor 622 may be any single-core or multicore processor such as those known under the trade name ARM Cortex by Texas Instruments. In certain instances, the microcontroller 620 may be an LM4F230H5QR, available from Texas Instruments, for example. In at least one example, the Texas Instruments LM4F230H5QR is an ARM Cortex-M4F processor core comprising 256 KB of single-cycle flash memory, or other non-volatile memory, up to 40 MHz, a prefetch buffer to improve performance above 40 MHz, a 32 KB single-cycle SRAM, an internal ROM loaded with StellarisWare® software, a 2 KB EEPROM, one or more PWM modules, one or more QEI analogs, and one or more 12-bit ADCs with 12 analog input channels, among other features that are readily available from the product datasheet. Other microcontrollers may be readily substituted for use with the module 4410. Accordingly, this disclosure should not be limited in this context.
[00243] In certain instances, the memory 624 may include program instructions for controlling each of the motors of the surgical instrument 600 that are couplable to the common control module 610. For example, the memory 624 may include program instructions for controlling the firing motor 602, the closure motor 603, and the articulation motors 606a, 606b.
Such program instructions may cause the processor 622 to control the firing, closure, and articulation functions in accordance with inputs from algorithms or control programs of the surgical instrument or tool.
[00244] In certain instances, one or more mechanisms and/or sensors, such as sensors 630, can be employed to alert the processor 622 to the program instructions that should be used in a particular setting. For example, the sensors 630 may alert the processor 622 to use the program instructions associated with firing, closing, and articulating the end effector. In certain instances, the sensors 630 may comprise position sensors which can be employed to sense the position of the switch 614, for example. Accordingly, the processor 622 may use the program instructions associated with firing the I-beam of the end effector upon detecting, through the sensors 630, for example, that the switch 614 is in the first position 616; the processor 622 may use the program instructions associated with closing the anvil upon detecting, through the sensors 630, for example, that the switch 614 is in the second position 617; and the processor 622 may use the program instructions associated with articulating the end effector upon detecting, through the sensors 630, for example, that the switch 614 is in the third or fourth position 618a, 618b.
[00245] Figure 17 is a schematic diagram of a robotic surgical instrument 700 configured to operate a surgical tool described herein, according to one aspect of this disclosure. The robotic surgical instrument 700 may be programmed or configured to control distal/proximal translation of a displacement member, distal/proximal displacement of a closure tube, shaft rotation, and articulation, either with single or multiple articulation drive links. In one aspect, the surgical instrument 700 may be programmed or configured to individually control a firing member, a closure member, a shaft member, and/or one or more articulation members.
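The selection of program instructions from the sensed position of the switch 614, as described in paragraph [00244], can be sketched as a simple dispatch. The handler functions below are invented stand-ins for the firing, closing, and articulation programs.

```python
# Sketch of selecting program instructions from the detected switch position.
# All handler functions are hypothetical stand-ins invented for illustration.
def run_firing_program():
    return "firing"

def run_closing_program():
    return "closing"

def run_articulation_program():
    return "articulating"

PROGRAM_BY_POSITION = {
    "616":  run_firing_program,        # first position: fire the I-beam
    "617":  run_closing_program,       # second position: close the anvil
    "618a": run_articulation_program,  # third position: articulate
    "618b": run_articulation_program,  # fourth position: articulate
}

def on_switch_detected(position):
    """Called when the position sensors report where the switch sits."""
    return PROGRAM_BY_POSITION[position]()
```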
The surgical instrument 700 comprises a control circuit 710 configured to control motor-driven firing members, closure members, shaft members, and/or one or more articulation members.
[00246] In one aspect, the robotic surgical instrument 700 comprises a control circuit 710 configured to control an anvil 716 and an I-beam 714 (including a sharp cutting edge) portion of an end effector 702, a removable staple cartridge 718, a shaft 740, and one or more articulation members 742a, 742b via a plurality of motors 704a to 704e. A position sensor 734 may be configured to provide position feedback of the I-beam 714 to the control circuit 710. Other sensors 738 may be configured to provide feedback to the control circuit 710. A timer/counter 731 provides timing and counting information to the control circuit 710. An energy source 712 may be provided to operate the motors 704a to 704e, and a current sensor 736 provides motor current feedback to the control circuit 710. The motors 704a to 704e can be operated individually by the control circuit 710 in an open-loop or closed-loop feedback control.
[00247] In one aspect, the control circuit 710 may comprise one or more microcontrollers, microprocessors, or other suitable processors for executing instructions that cause the processor or processors to perform one or more tasks. In one aspect, a timer/counter 731 provides an output signal, such as the elapsed time or a digital count, to the control circuit 710 to correlate the position of the I-beam 714, as determined by the position sensor 734, with the output of the timer/counter 731 such that the control circuit 710 can determine the position of the I-beam 714 at a specific time (t) relative to a starting position, or the time (t) at which the I-beam 714 is at a specific position relative to a starting position. The timer/counter 731 may be configured to measure elapsed time, count external events, or time external events.
[00248] In one aspect, the control circuit 710 may be programmed to control functions of the end effector 702 based on one or more tissue conditions. The control circuit 710 may be programmed to sense tissue conditions, such as thickness, either directly or indirectly, as described herein. The control circuit 710 may be programmed to select a firing control program or a closure control program based on tissue conditions. A firing control program may describe the distal motion of the displacement member. Different firing control programs may be selected to better treat different tissue conditions. For example, when thicker tissue is present, the control circuit 710 may be programmed to translate the displacement member at a lower velocity and/or with lower power. When thinner tissue is present, the control circuit 710 may be programmed to translate the displacement member at a higher velocity and/or with higher power. A closure control program may control the closure force applied to the tissue by the anvil 716. Other control programs control the rotation of the shaft 740 and the articulation members 742a, 742b.
[00249] In one aspect, the control circuit 710 may generate motor set point signals. The motor set point signals may be provided to various motor controllers 708a to 708e. The motor controllers 708a to 708e may comprise one or more circuits configured to provide motor drive signals to the motors 704a to 704e to drive the motors 704a to 704e, as described herein. In some examples, the motors 704a to 704e may be brushed DC electric motors. For example, the velocity of the motors 704a to 704e may be proportional to the respective motor drive signals.
In some examples, the motors 704a to 704e may be brushless DC electric motors, and the respective motor drive signals may comprise a PWM signal provided to one or more stator windings of the motors 704a to 704e.
[00250] In one aspect, the control circuit 710 may initially operate each of the motors 704a to 704e in an open-loop configuration for a first open-loop portion of a stroke of the displacement member. Based on the response of the robotic surgical instrument 700 during the open-loop portion of the stroke, the control circuit 710 may select a firing control program in a closed-loop configuration. The response of the instrument may include a translation distance of the displacement member during the open-loop portion, a time elapsed during the open-loop portion, the energy provided to one of the motors 704a to 704e during the open-loop portion, a sum of pulse widths of a motor drive signal, etc. After the open-loop portion, the control circuit 710 may implement the selected firing control program for a second portion of the displacement member stroke. For example, during a closed-loop portion of the stroke, the control circuit 710 may modulate one of the motors 704a to 704e based on translation data describing a position of the displacement member in a closed loop to translate the displacement member at a constant velocity.
[00251] In one aspect, the motors 704a to 704e may receive power from an energy source 712. The energy source 712 may be a DC power supply driven by a main alternating current power source, a battery, a super capacitor, or any other suitable energy source. The motors 704a to 704e may be mechanically coupled to individual movable mechanical elements, such as the I-beam 714, the anvil 716, the shaft 740, the articulation 742a, and the articulation 742b, via respective transmissions 706a to 706e. The transmissions 706a to 706e may include one or more gears or other linkage components to couple the motors 704a to 704e to the movable mechanical elements.
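The two-phase strategy of paragraph [00250], an open-loop portion used to characterize the instrument response followed by a selected closed-loop firing program, might be sketched as follows. The threshold, gain, and speed values are invented for this sketch; a real implementation would use the control programs stored in memory.

```python
# Two-phase firing sketch: run a fixed open-loop burst, characterize the
# response, then pick a closed-loop program. All constants are invented.
def characterize_open_loop(distance_mm, elapsed_s):
    """Classify tissue from how far the member moved in the open-loop burst:
    slow travel under a fixed drive suggests thicker, stiffer tissue."""
    speed = distance_mm / elapsed_s
    return "thick_tissue" if speed < 4.0 else "thin_tissue"

def closed_loop_step(target_speed, measured_speed, duty, gain=0.02):
    """One proportional closed-loop update of the motor duty cycle,
    clamped to the valid [0, 1] range."""
    duty += gain * (target_speed - measured_speed)
    return min(max(duty, 0.0), 1.0)

# Usage: the open-loop burst moved 2 mm in 1 s, so choose the slower program.
program = characterize_open_loop(distance_mm=2.0, elapsed_s=1.0)
target = 2.5 if program == "thick_tissue" else 6.0
duty = closed_loop_step(target, measured_speed=1.0, duty=0.3)
```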
A position sensor 734 may sense a position of the I-beam 714. The position sensor 734 may be or may include any type of sensor that is capable of generating position data indicating a position of the I-beam 714. In some examples, the position sensor 734 may include an encoder configured to provide a series of pulses to the control circuit 710 as the I-beam 714 translates distally and proximally. The control circuit 710 may track the pulses to determine the position of the I-beam 714. Other suitable position sensors may be used, including, for example, a proximity sensor. Other types of position sensors may provide other signals indicating motion of the I-beam 714. Also, in some examples, the position sensor 734 may be omitted. Where any of the motors 704a to 704e is a stepper motor, the control circuit 710 may track the position of the I-beam 714 by aggregating the number and direction of steps the motor 704 has been instructed to execute. The position sensor 734 may be located in the end effector 702 or at any other portion of the instrument. The outputs of each of the motors 704a to 704e include a torque sensor 744a to 744e to sense force and have an encoder to sense rotation of the drive shaft.
[00252] In one aspect, the control circuit 710 is configured to drive a firing member such as the I-beam 714 portion of the end effector 702. The control circuit 710 provides a motor set point to a motor control 708a, which provides a drive signal to the motor 704a. The output shaft of the motor 704a is coupled to a torque sensor 744a. The torque sensor 744a is coupled to a transmission 706a that is coupled to the I-beam 714.
[00253] In one aspect, the control circuit 710 is configured to drive a closure member such as the anvil 716 portion of the end effector 702. The control circuit 710 provides a motor set point to a motor control 708b, which provides a drive signal to the motor 704b. The output shaft of the motor 704b is coupled to a torque sensor 744b.
The torque sensor 744b is coupled to a transmission 706b which is coupled to the anvil 716. The transmission 706b comprises movable mechanical elements, such as rotary elements and a closure member, to control the motion of the anvil 716 between the open and closed positions. In one aspect, the motor 704b is coupled to a closure gear assembly, which includes a closure reduction gear set that is supported in meshing engagement with the closure spur gear. The torque sensor 744b provides a closure force feedback signal to the control circuit 710.
[00254] In one aspect, the control circuit 710 is configured to rotate a shaft member, such as the shaft 740, to rotate the end effector 702. The control circuit 710 provides a motor set point to a motor control 708c, which provides a drive signal to the motor 704c. The output shaft of the motor 704c is coupled to a torque sensor 744c. The torque sensor 744c is coupled to a transmission 706c which is coupled to the shaft 740. The transmission 706c comprises movable mechanical elements, such as rotary elements, to control the rotation of the shaft 740 clockwise or counterclockwise up to and over 360°. In one aspect, the motor 704c is coupled to the rotary drive assembly, which includes a tube gear segment that is formed on (or attached to) the proximal end of the proximal closure tube for operable engagement by a rotational gear assembly that is operably supported on the tool mounting plate. The torque sensor 744c provides a rotation force feedback signal to the control circuit 710. The rotation force feedback signal represents the rotation force applied to the shaft 740.
[00255] In one aspect, the control circuit 710 is configured to articulate the end effector 702. The control circuit 710 provides a motor set point to a motor control 708d, which provides a drive signal to the motor 704d. The output shaft of the motor 704d is coupled to a torque sensor 744d.
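One plausible use of the closure force feedback signal from the torque sensor described above is to scale back the closure motor's drive as the sensed force approaches a limit. This is a sketch under invented limits, not a control law taken from this disclosure.

```python
# Sketch: taper the closing-motor duty cycle as the torque-sensor-derived
# closure force nears a hard limit. Both limit values are invented.
def closing_duty(feedback_force_n, requested_duty,
                 soft_limit_n=120.0, hard_limit_n=150.0):
    """Scale the closure-motor duty down between the soft and hard limits;
    stop entirely at or above the hard limit."""
    if feedback_force_n >= hard_limit_n:
        return 0.0
    if feedback_force_n > soft_limit_n:
        span = hard_limit_n - soft_limit_n
        return requested_duty * (hard_limit_n - feedback_force_n) / span
    return requested_duty
```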
The torque sensor 744d is coupled to a transmission 706d which is coupled to an articulation member 742a. The transmission 706d comprises movable mechanical elements, such as articulation elements, to control the articulation of the end effector 702 ± 65°. In one aspect, the motor 704d is coupled to an articulation nut, which is rotatably journaled on the proximal end portion of the distal spine portion and is rotatably driven thereon by an articulation gear assembly. The torque sensor 744d provides an articulation force feedback signal to the control circuit 710. The articulation force feedback signal represents the articulation force applied to the end effector 702. The sensors 738, such as an articulation encoder, may provide the articulation position of the end effector 702 to the control circuit 710.
[00256] In another aspect, the articulation function of the robotic surgical system 700 may comprise two articulation members, or links, 742a, 742b. These articulation members 742a, 742b are driven by separate disks on the robot interface (the rack), which are driven by the two motors 708d, 708e. When the separate firing motor 704a is provided, each articulation link 742a, 742b can be antagonistically driven with respect to the other link to provide a resistive holding motion and a load to the head when it is not moving and to provide an articulation motion as the head is articulated. The articulation members 742a, 742b attach to the head at a fixed radius as the head is rotated. Accordingly, the mechanical advantage of the push-and-pull link changes as the head is rotated. This change in mechanical advantage may be more pronounced with other articulation link drive systems.
[00257] In one aspect, the one or more motors 704a to 704e may comprise a brushed DC motor with a gearbox and mechanical links to a firing member, closure member, or articulation member.
Another example includes electric motors 704a to 704e that operate the movable mechanical elements such as the displacement member, the articulation links, the closure tube, and the shaft. An outside influence is an unmeasured, unpredictable influence of things such as tissue, surrounding bodies, and friction on the physical system. Such outside influence can be referred to as drag, which acts in opposition to one of the electric motors 704a to 704e. The outside influence, such as drag, may cause the operation of the physical system to deviate from a desired operation of the physical system.
[00258] In one aspect, the position sensor 734 may be implemented as an absolute positioning system. In one aspect, the position sensor 734 may comprise a magnetic rotary absolute positioning system implemented as an AS5055EQFT single-chip magnetic rotary position sensor, available from Austria Microsystems, AG. The position sensor 734 may interface with the control circuit 710 to provide an absolute positioning system. The position may include multiple Hall-effect elements located above a magnet and coupled to a CORDIC processor, also known as the digit-by-digit method and Volder's algorithm, which is provided to implement a simple and efficient algorithm to calculate hyperbolic and trigonometric functions that require only addition, subtraction, bit-shift, and table lookup operations.
[00259] In one aspect, the control circuit 710 may be in communication with one or more sensors 738. The sensors 738 may be positioned on the end effector 702 and adapted to operate with the robotic surgical instrument 700 to measure the various derived parameters such as the gap distance versus time, tissue compression versus time, and anvil strain versus time.
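The CORDIC (Volder's algorithm) computation mentioned above for the position sensor's processor uses only additions, subtractions, bit shifts, and a small arctangent lookup table. A floating-point sketch of the rotation mode (computing sine and cosine) is shown below; the iteration count is illustrative, and a fixed-point device would replace the multiplications by `2**-i` with literal bit shifts.

```python
import math

# CORDIC rotation-mode sketch: compute sin/cos of an angle with shift-add
# iterations against a precomputed arctangent table. 24 iterations is an
# illustrative choice giving roughly single-precision accuracy.
N = 24
ATAN_TABLE = [math.atan(2.0 ** -i) for i in range(N)]

# Cumulative scaling factor K ~= 0.60725; pre-applying it to x avoids a
# final multiply.
K = 1.0
for i in range(N):
    K /= math.sqrt(1.0 + 2.0 ** (-2 * i))

def cordic_sin_cos(theta):
    """Return (sin(theta), cos(theta)) for |theta| < pi/2 via CORDIC."""
    x, y, z = K, 0.0, theta
    for i in range(N):
        d = 1.0 if z >= 0.0 else -1.0            # rotate toward zero residual
        x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
        z -= d * ATAN_TABLE[i]                    # table lookup, no division
    return y, x
```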
The sensors 738 may comprise a magnetic sensor, a magnetic field sensor, a strain gauge, a load cell, a pressure sensor, a force sensor, a torque sensor, an inductive sensor such as an eddy current sensor, a resistive sensor, a capacitive sensor, an optical sensor, and/or any other suitable sensor for measuring one or more parameters of the end effector 702.
[00260] In one aspect, the one or more sensors 738 may comprise a strain gauge, such as a micro-strain gauge, configured to measure the magnitude of the strain in the anvil 716 during a clamped condition. The strain gauge provides an electrical signal whose amplitude varies with the magnitude of the strain. The sensors 738 may comprise a pressure sensor configured to detect a pressure generated by the presence of compressed tissue between the anvil 716 and the staple cartridge 718. The sensors 738 may be configured to detect impedance of a tissue section located between the anvil 716 and the staple cartridge 718 that is indicative of the thickness and/or fullness of tissue located therebetween.
[00261] In one aspect, the sensors 738 may be implemented as one or more limit switches, electromechanical devices, solid-state switches, Hall-effect devices, magneto-resistive (MR) devices, giant magneto-resistive (GMR) devices, magnetometers, among others. In other implementations, the sensors 738 may be implemented as solid-state switches that operate under the influence of light, such as optical sensors, IR sensors, ultraviolet sensors, among others. Still, the switches may be solid-state devices such as transistors (e.g., FET, junction FET, MOSFET, bipolar, and the like). In other implementations, the sensors 738 may include conductorless switches, ultrasonic switches, accelerometers, and inertial sensors, among others.
[00262] In one aspect, the sensors 738 may be configured to measure forces exerted on the anvil 716 by the closure drive system.
For example, one or more sensors 738 can be at an interaction point between the closure tube and the anvil 716 to detect the closure forces applied by the closure tube to the anvil 716. The forces exerted on the anvil 716 can be representative of the tissue compression experienced by the tissue section captured between the anvil 716 and the staple cartridge 718.
[00263] In one aspect, a current sensor 736 can be employed to measure the current drawn by each of the motors 704a to 704e. The force required to advance any of the movable mechanical elements, such as the I-beam 714, corresponds to the current drawn by one of the motors 704a to 704e. The force is converted to a digital signal and provided to the control circuit 710. The control circuit 710 can be configured to simulate the response of the actual system of the instrument in the software of the controller. A displacement member can be actuated to move an I-beam 714 in the end effector 702 at or near a target velocity. The robotic surgical instrument 700 can include a feedback controller, which can be one or any feedback controller including, but not limited to, a PID controller, a state feedback controller, a linear quadratic regulator (LQR), and/or an adaptive controller, for example. The robotic surgical instrument 700 can include a power source to convert the signal from the feedback controller into a physical input such as case voltage, PWM voltage, frequency modulated voltage, current, torque, and/or force, for example. Additional details are disclosed in US patent application serial number 15/636,829, entitled
[00264] Figure 18 illustrates a block diagram of a surgical instrument 750 programmed to control the distal translation of a displacement member according to one aspect of this disclosure. In one aspect, the surgical instrument 750 is programmed to control the distal translation of a displacement member such as the I-beam 764.
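Of the feedback controllers listed above (PID, state feedback, LQR, adaptive), the PID form is the simplest to sketch. The gains and time step below are invented for illustration only.

```python
# Minimal PID controller of the kind listed for the velocity feedback loop.
# Gains (kp, ki, kd) and the sample period dt are illustrative, not tuned.
class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measured):
        """One control step: return the actuation command for this sample."""
        error = setpoint - measured
        self.integral += error * self.dt                 # accumulated error
        derivative = (error - self.prev_error) / self.dt  # error trend
        self.prev_error = error
        return (self.kp * error
                + self.ki * self.integral
                + self.kd * derivative)
```

In practice the controller output would feed the power source described above, which converts it into a PWM voltage, current, or torque applied to the motor.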
The surgical instrument 750 comprises an end effector 752 that may comprise an anvil 766, an I-beam 764 (including a sharp cutting edge), and a removable staple cartridge 768.
[00265] The position, movement, displacement, and/or translation of a linear displacement member, such as the I-beam 764, can be measured by an absolute positioning system, a sensor arrangement, and a position sensor 784. Because the I-beam 764 is coupled to a longitudinally movable drive member, the position of the I-beam 764 can be determined by measuring the position of the longitudinally movable drive member employing the position sensor 784. Accordingly, in the following description, the position, displacement, and/or translation of the I-beam 764 can be achieved by the position sensor 784 as described herein. A control circuit 760 may be programmed to control the translation of the displacement member, such as the I-beam 764. The control circuit 760, in some examples, may comprise one or more microcontrollers, microprocessors, or other suitable processors for executing instructions that cause the processor or processors to control the displacement member, e.g., the I-beam 764, in the manner described. In one aspect, a timer/counter 781 provides an output signal, such as the elapsed time or a digital count, to the control circuit 760 to correlate the position of the I-beam 764, as determined by the position sensor 784, with the output of the timer/counter 781 such that the control circuit 760 can determine the position of the I-beam 764 at a specific time (t) relative to a starting position. The timer/counter 781 may be configured to measure elapsed time, count external events, or time external events.
[00266] The control circuit 760 may generate a motor set point signal 772. The motor set point signal 772 may be provided to a motor controller 758.
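The path from the motor set point signal 772 to the motor drive signal 774 can be illustrated by a simple mapping from a velocity set point to a PWM duty cycle, since the text notes the motor velocity can be proportional to the drive signal. The scaling constant is invented for this sketch.

```python
# Sketch of the set point path: the control circuit emits a velocity set
# point, and the motor controller turns it into a PWM duty cycle for the
# drive signal. The maximum-speed scaling constant is invented.
MAX_SPEED_MM_S = 10.0

def motor_drive_signal(setpoint_mm_s):
    """Map a speed set point to a PWM duty cycle clamped to [0, 1]."""
    duty = setpoint_mm_s / MAX_SPEED_MM_S
    return min(max(duty, 0.0), 1.0)
```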
The motor controller 758 can comprise one or more circuits configured to provide a motor drive signal 774 to the motor 754 to drive the motor 754, as described in the present invention. In some instances, the motor 754 may be a brushed DC electric motor. For example, the speed of the motor 754 can be proportional to the motor drive signal 774. In some instances, the motor 754 can be a brushless DC electric motor and the motor drive signal 774 can comprise a PWM signal provided to one or more stator windings of the motor 754. In addition, in some examples, the motor controller 758 may be omitted, and the control circuit 760 can generate the motor drive signal 774 directly. [00267] [00267] The motor 754 can receive power from a power source [00268] [00268] The control circuit 760 can be in communication with one or more sensors 788. The sensors 788 can be positioned on the end actuator 752 and adapted to work with the surgical instrument 750 to measure various derived parameters, such as gap distance with respect to time, tissue compression with respect to time and anvil mechanical strain with respect to time. The sensors 788 can comprise a magnetic sensor, a magnetic field sensor, a strain gauge, a pressure sensor, a force sensor, an inductive sensor such as an eddy current sensor, a resistive sensor, a capacitive sensor, an optical sensor and / or any other sensors suitable for measuring one or more parameters of the end actuator 752. The sensors 788 may include one or more sensors. [00269] [00269] The one or more sensors 788 may comprise a strain gauge, such as a micro-strain gauge, configured to measure the magnitude of the mechanical strain on the anvil 766 during a clamping condition. The strain gauge provides an electrical signal whose amplitude varies with the magnitude of the strain.
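The proportionality between motor speed and drive signal described above can be illustrated with a hypothetical mapping from a setpoint to a PWM duty cycle and an average applied winding voltage. The names and the linear model are assumptions, not from the application:

```python
def pwm_duty_cycle(setpoint_speed, max_speed, v_supply):
    """Map a motor setpoint to a clamped PWM duty cycle and the resulting
    average applied voltage, assuming speed is roughly proportional to
    average voltage (illustrative linear model)."""
    duty = max(0.0, min(1.0, setpoint_speed / max_speed))
    return duty, duty * v_supply
```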
The sensors 788 can comprise a pressure sensor configured to detect a pressure generated by the presence of compressed tissue between the anvil 766 and the staple cartridge 768. The sensors 788 can be configured to detect the impedance of a tissue section located between the anvil 766 and the staple cartridge 768, which is indicative of the thickness and / or fullness of the tissue located between them. [00270] [00270] The sensors 788 can be configured to measure the forces exerted on the anvil 766 by a closing drive system. For example, one or more sensors 788 can be located at a point of interaction between a closing tube and the anvil 766 to detect the closing forces applied by the closing tube to the anvil 766. The forces exerted on the anvil 766 can be representative of the tissue compression experienced by the tissue section captured between the anvil 766 and the staple cartridge 768. The one or more sensors 788 can be positioned at various points of interaction along the closing drive system to detect the closing forces applied to the anvil 766 by the closing drive system. The one or more sensors 788 can be sampled in real time during a clamping operation by a processor of the control circuit 760. The control circuit 760 receives the sample measurements in real time to provide and analyze time-based information and assess, in real time, the closing forces applied to the anvil 766. [00271] [00271] A current sensor 786 can be used to measure the current drawn by the motor 754. The force required to advance the I-beam 764 corresponds to the current drawn by the motor 754. [00273] [00273] The actual drive system of the surgical instrument 750 is configured to drive the displacement member, the cutting member or the I-beam 764 by means of a brushed DC motor with a gearbox and mechanical links to an articulation and / or knife system.
An example is the electric motor 754 that operates the displacement member and the articulation drive, for example, of an interchangeable drive shaft assembly. An external influence is an unmeasured, unpredictable influence of things such as tissue, surrounding bodies and friction on the physical system. Such an external influence can be called drag, which acts in opposition to the electric motor 754. An external influence, such as drag, can cause the operation of the physical system to deviate from a desired operation of the physical system. [00274] [00274] Various exemplary aspects are directed to a surgical instrument 750 that comprises an end actuator 752 with motor-driven surgical stapling and cutting implements. For example, a motor 754 can drive a displacement member distally and proximally along a longitudinal axis of the end actuator 752. The end actuator 752 may comprise an articulating anvil 766 and, when configured for use, a staple cartridge 768 positioned opposite the anvil 766. A clinician can grasp tissue between the anvil 766 and the staple cartridge 768, as described in the present invention. When ready to use the instrument 750, the clinician can provide a firing signal, for example, by pressing a trigger on the instrument. [00275] [00275] In various examples, the surgical instrument 750 may comprise a control circuit 760 programmed to control the distal translation of the displacement member, such as the I-beam 764, for example, based on one or more tissue conditions. The control circuit 760 can be programmed to directly or indirectly detect tissue conditions, such as thickness, as described herein. The control circuit 760 can be programmed to select a firing control program based on tissue conditions. A firing control program can describe the distal movement of the displacement member. Different firing control programs can be selected to better treat different tissue conditions.
For example, when thicker tissue is present, the control circuit 760 can be programmed to translate the displacement member at a lower speed and / or with lower power. When thinner tissue is present, the control circuit 760 can be programmed to translate the displacement member at a higher speed and / or with higher power. [00276] [00276] In some examples, the control circuit 760 may initially operate the motor 754 in an open-loop configuration for a first open-loop portion of a stroke of the displacement member. Based on a response of the instrument 750 during the open-loop portion of the stroke, the control circuit 760 can select a firing control program. The response of the instrument may include a translation distance of the displacement member during the open-loop portion, a time elapsed during the open-loop portion, the power supplied to the motor 754 during the open-loop portion, a sum of pulse widths of a motor drive signal, etc. After the open-loop portion, the control circuit 760 can implement the selected firing control program for a second portion of the stroke of the displacement member. For example, during a closed-loop portion of the stroke, the control circuit 760 can modulate the motor 754 based on translation data describing a position of the displacement member, in a closed-loop manner, to translate the displacement member at a constant speed. Additional details are disclosed in US patent application serial number 15 / 720,852, entitled SYSTEM AND METHODS FOR CONTROLLING A DISPLAY OF A SURGICAL INSTRUMENT, filed on September 29, 2017, which is hereby incorporated by reference in its entirety. [00277] [00277] Figure 19 is a schematic diagram of a surgical instrument 790 configured to control various functions in accordance with an aspect of the present description. In one aspect, the surgical instrument 790 is programmed to control the distal translation of a displacement member, such as the I-beam 764.
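The two-phase scheme described above, an open-loop portion whose measured response selects a closed-loop firing control program, can be sketched as follows. The thresholds, program labels and the advance-rate heuristic are hypothetical, not taken from the application:

```python
def select_firing_program(open_loop_travel_mm, elapsed_s, programs):
    """Pick a closed-loop firing program from the instrument's response during
    the open-loop portion of the stroke. Heuristic sketch: slow advance for a
    given drive suggests thicker tissue, so choose a lower-speed program.
    `programs` is a list of (max_rate_mm_per_s, program) sorted ascending."""
    rate = open_loop_travel_mm / elapsed_s
    for threshold, program in programs:
        if rate <= threshold:
            return program
    return programs[-1][1]  # fall back to the fastest program
```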
The surgical instrument 790 comprises an end actuator 792 which may comprise an anvil 766, an I-beam 764 and a removable staple cartridge 768 that can be interchanged with an RF cartridge 796 (shown in dashed lines). [00278] [00278] In one aspect, the sensors 788 can be implemented as a limit switch, an electromechanical device, solid-state switches, Hall-effect devices, magneto-resistive (MR) devices, giant magneto-resistive (GMR) devices, magnetometers, among others. In other implementations, the sensors 788 can be solid-state switches that operate under the influence of light, such as optical sensors, infrared sensors, ultraviolet sensors, among others. In addition, the switches can be solid-state devices such as transistors (for example, FET, junction FET, MOSFET, bipolar, and the like). In other implementations, the sensors 788 can include conductorless electrical switches, ultrasonic switches, [00279] [00279] In one aspect, the position sensor 784 can be implemented as an absolute positioning system comprising a magnetic rotary absolute positioning system implemented as a single-chip magnetic rotary position sensor AS5055EQFT, available from Austria Microsystems, AG. The position sensor 784 can interface with the control circuit 760 to provide an absolute positioning system. The position sensor 784 can include multiple Hall-effect elements located above a magnet and coupled to a CORDIC processor, also known as the digit-by-digit method or Volder's algorithm, which implements a simple and efficient algorithm for calculating hyperbolic and trigonometric functions using only addition, subtraction, bit-shift and lookup-table operations. [00280] [00280] In one aspect, the I-beam 764 can be implemented as a knife member comprising a knife body that operably supports a tissue cutting blade therein and may further include tabs or anvil engagement features and channel engagement features or a foot.
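The CORDIC (Volder) algorithm mentioned above really does reduce trigonometric evaluation to additions, subtractions, bit shifts and a small arctangent table. A floating-point sketch for illustration (a sensor's processor would use a fixed-point version; names are illustrative):

```python
import math

def cordic_sin_cos(angle, iterations=24):
    """Compute (sin, cos) of `angle` (radians, |angle| < ~1.74) by rotating a
    vector through precomputed arctangent steps, as in Volder's CORDIC."""
    atan_table = [math.atan(2.0 ** -i) for i in range(iterations)]  # the LUT
    gain = 1.0
    for i in range(iterations):
        gain /= math.sqrt(1.0 + 2.0 ** (-2 * i))  # CORDIC scaling constant
    x, y, z = gain, 0.0, angle
    for i in range(iterations):
        d = 1.0 if z >= 0 else -1.0  # rotate toward residual angle z
        x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
        z -= d * atan_table[i]
    return y, x  # (sin, cos)
```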
In one aspect, the staple cartridge 768 can be implemented as a standard (mechanical) surgical staple cartridge. In one aspect, the RF cartridge 796 can be implemented as an RF cartridge. These and other sensor arrangements are described in commonly owned US patent application serial number 15 / 628,175, entitled TECHNIQUES [00281] [00281] The position, movement, displacement and / or translation of a linear displacement member, such as the I-beam 764, [00282] [00282] The control circuit 760 can generate a motor setpoint signal 772. The motor setpoint signal 772 can be supplied to a motor controller 758. The motor controller 758 can comprise one or more circuits configured to provide a motor drive signal 774 to the motor 754 to drive the motor 754. [00283] [00283] The motor 754 can receive power from an energy source [00284] [00284] The control circuit 760 can be in communication with one or more sensors 788. The sensors 788 can be positioned on the end actuator 792 and adapted to work with the surgical instrument 790 to measure various derived parameters, such as gap distance with respect to time, tissue compression with respect to time and anvil mechanical strain with respect to time. The sensors 788 can comprise a magnetic sensor, a magnetic field sensor, a strain gauge, a pressure sensor, a force sensor, an inductive sensor such as an eddy current sensor, a resistive sensor, a capacitive sensor, an optical sensor and / or any other sensors suitable for measuring one or more parameters of the end actuator 792. The sensors 788 may include one or more sensors. [00285] [00285] The one or more sensors 788 may comprise a strain gauge, such as a micro-strain gauge, configured to measure the magnitude of the mechanical strain on the anvil 766 during a clamping condition. The strain gauge provides an electrical signal whose amplitude varies with the magnitude of the strain.
The sensors 788 can comprise a pressure sensor configured to detect a pressure generated by the presence of compressed tissue between the anvil 766 and the staple cartridge 768. The sensors 788 can be configured to detect the impedance of a tissue section located between the anvil 766 and the staple cartridge 768, which is indicative of the thickness and / or fullness of the tissue located between them. [00286] [00286] The sensors 788 can be configured to measure the forces exerted on the anvil 766 by the closing drive system. For example, one or more sensors 788 can be located at a point of interaction between a closing tube and the anvil 766 to detect the closing forces applied by the closing tube to the anvil 766. The forces exerted on the anvil 766 can be representative of the tissue compression experienced by the tissue section captured between the anvil 766 and the staple cartridge 768. The one or more sensors 788 can be positioned at various points of interaction along the closing drive system to detect the closing forces applied to the anvil 766 by the closing drive system. The one or more sensors 788 can be sampled in real time during a clamping operation by a processor portion of the control circuit 760. The control circuit 760 receives the sample measurements in real time to provide and analyze time-based information and assess, in real time, the closing forces applied to the anvil 766. [00287] [00287] A current sensor 786 can be used to measure the current drawn by the motor 754. The force required to advance the I-beam 764 corresponds to the current drawn by the motor 754. [00288] [00288] An RF power source 794 is coupled to the end actuator 792 and is applied to the RF cartridge 796 when the RF cartridge 796 is loaded on the end actuator 792 in place of the staple cartridge 768. The control circuit 760 controls the delivery of RF energy to the RF cartridge 796.
[00289] [00289] Additional details are disclosed in US patent application serial number 15 / 636,096, entitled SURGICAL SYSTEM COUPLABLE WITH STAPLE CARTRIDGE AND RADIO FREQUENCY [00290] [00290] Figure 20 is a simplified block diagram of a generator 800 configured to provide inductorless tuning, among other benefits. Additional details of the generator 800 are described in US patent No. 9,060,775, entitled SURGICAL GENERATOR FOR ULTRASONIC AND ELECTROSURGICAL DEVICES, issued on June 23, 2015, which is hereby incorporated by reference in its entirety. The generator 800 can comprise a patient-isolated stage 802 in communication with a non-isolated stage 804 via a power transformer 806. A secondary winding 808 of the power transformer 806 is contained in the isolated stage 802 and can comprise a tapped configuration (for example, a center-tapped or non-center-tapped configuration) to define drive signal outputs 810a, 810b, 810c for providing drive signals to different surgical instruments, such as an ultrasonic surgical device, an RF electrosurgical instrument and a multifunctional surgical instrument that includes ultrasonic and RF energy modes that can be delivered alone or simultaneously. In particular, the drive signal outputs 810a and 810c can provide an ultrasonic drive signal (for example, a 420 V root-mean-square (RMS) drive signal) to an ultrasonic surgical instrument, and the drive signal outputs 810b and 810c can provide an electrosurgical drive signal (for example, a 100 V RMS drive signal) to an RF electrosurgical instrument, with the drive signal output 810b corresponding to the center tap of the power transformer 806. [00291] [00291] In certain forms, the ultrasonic and electrosurgical drive signals can be supplied simultaneously to different surgical instruments and / or to a single surgical instrument, such as the multifunctional surgical instrument, which has the capacity to deliver both ultrasonic and electrosurgical energy to the tissue.
It will be appreciated that the electrosurgical signal provided either by the dedicated electrosurgical instrument or by the combined multifunctional electrosurgical / ultrasonic instrument can be either a therapeutic-level or a subtherapeutic-level signal, where the subtherapeutic signal can be used, for example, to monitor tissue or instrument conditions and provide feedback to the generator. For example, the RF and ultrasonic signals can be supplied separately or simultaneously from a generator with a single output port in order to provide the desired output signal to the surgical instrument, as will be discussed in more detail below. Consequently, the generator can combine the RF and ultrasonic electrosurgical energies and deliver the combined energies to the multifunctional electrosurgical / ultrasonic instrument. Bipolar electrodes can be placed on one or both jaws of the end actuator. One jaw can be driven by ultrasonic energy in addition to RF electrosurgical energy, with the two working simultaneously. The ultrasonic energy can be used to dissect tissue, while the RF electrosurgical energy can be used to cauterize vessels. [00292] [00292] The non-isolated stage 804 may comprise a power amplifier 812 that has an output connected to a primary winding 814 of the power transformer 806. In certain forms, the power amplifier 812 may comprise a push-pull amplifier. For example, the non-isolated stage 804 can further comprise a logic device 816 for providing a digital output to a digital-to-analog converter (DAC) circuit 818 which, in turn, provides an analog signal corresponding to an input of the power amplifier 812. In certain forms, the logic device 816 may comprise a programmable gate array (PGA), a field-programmable gate array (FPGA), a programmable logic device (PLD), among other logic circuits, for example.
The logic device 816, by controlling the input of the power amplifier 812 through the DAC circuit 818, can therefore control any of several parameters (for example, frequency, waveform shape, waveform amplitude) of the drive signals that appear at the drive signal outputs 810a, 810b and 810c. In certain forms, and as discussed below, the logic device 816, in conjunction with a processor (for example, a DSP discussed below), can implement various DSP-based algorithms and / or other control algorithms to control parameters of the drive signals provided by the generator 800. [00293] [00293] Power can be supplied to a power rail of the power amplifier 812 by a switch-mode regulator 820, for example, a power converter. In certain forms, the switch-mode regulator 820 may comprise an adjustable buck regulator, for example. The non-isolated stage 804 may further comprise a first processor 822 which, in one form, may comprise a DSP processor, such as an ADSP-21469 SHARC DSP, available from Analog Devices, Norwood, MA, USA, for example, although any suitable processor can be used in various forms. In certain forms, the DSP processor 822 can control the operation of the switch-mode regulator 820 in response to voltage feedback information received from the power amplifier 812 by the DSP processor 822 via an ADC circuit 824. For example, the DSP processor 822 can receive as input, via the ADC circuit 824, the waveform envelope of a signal (for example, an RF signal) being amplified by the power amplifier 812. The DSP processor 822 can then control the switch-mode regulator 820 (for example, via a PWM output) so that the rail voltage supplied to the power amplifier 812 tracks the waveform envelope of the amplified signal. By dynamically modulating the rail voltage of the power amplifier 812 based on the waveform envelope, the efficiency of the power amplifier 812 can be significantly improved relative to fixed-rail-voltage amplifier schemes.
[00294] [00294] In certain forms, the logic device 816, in conjunction with the DSP processor 822, can implement a direct digital synthesizer (DDS) control scheme to control the waveform shape, frequency and / or amplitude of the drive signals output by the generator 800. In one form, for example, the logic device 816 can implement a DDS control algorithm by retrieving waveform samples stored in a dynamically updated lookup table ("LUT"), such as a RAM LUT that can be embedded in an FPGA. This control algorithm is particularly useful for ultrasonic applications, in which an ultrasonic transducer can be driven by a clean sinusoidal current at its resonant frequency. Because other frequencies can excite parasitic resonances, minimizing or reducing the total distortion of the motional branch current can correspondingly minimize or reduce undesirable resonance effects. Because the waveform shape of a drive signal output by the generator 800 is impacted by various sources of distortion present in the output drive circuit (for example, the power transformer 806, the power amplifier 812), voltage and current feedback data based on the drive signal can be provided to an algorithm, such as an error-control algorithm implemented by the DSP processor 822, which compensates for the distortion by suitably pre-distorting or modifying the waveform samples stored in the LUT dynamically and continuously (for example, in real time). In one form, the amount or degree of pre-distortion applied to the LUT samples can be based on the error between a computed motional branch current and a desired current waveform, with the error being determined on a sample-by-sample basis. In this way, the pre-distorted LUT samples, when processed through the drive circuit, can result in a motional branch drive signal that has the desired waveform shape (for example, sinusoidal) to optimally drive the ultrasonic transducer.
In such forms, the LUT waveform samples will therefore not represent the desired waveform shape of the drive signal, but rather the waveform shape that is needed to ultimately produce the desired waveform shape of the motional branch drive signal, when the distortion effects are taken into account. [00295] [00295] The non-isolated stage 804 may further comprise a first A/D converter circuit 826 and a second A/D converter circuit 828 coupled to the output of the power transformer 806 by means of respective isolation transformers 830 and 832 to sample, respectively, the voltage and the current of the drive signals output by the generator 800. In certain forms, the A/D converter circuits 826 and 828 can be configured to sample at high speeds (for example, 80 mega-samples per second ("MSPS")) to enable oversampling of the drive signals. In one form, for example, the sampling speed of the A/D converter circuits 826 and 828 can enable oversampling of the drive signals by approximately 200x (depending on the frequency). In certain forms, the sampling operations of the A/D converter circuits 826 and 828 can be performed by a single A/D converter circuit that receives the input voltage and current signals through a two-way multiplexer. The use of high-speed sampling in forms of the generator 800 can enable, among other things, the calculation of the complex current flowing through the motional branch (which can be used, in certain forms, to implement the DDS-based waveform-shape control described above), accurate digital filtering of the sampled signals and the calculation of actual power consumption with a high degree of precision.
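The DDS scheme described above can be sketched as a phase accumulator indexing a waveform LUT. The accumulator width, LUT contents and tuning word below are illustrative assumptions, not values from the application:

```python
import math

def dds_samples(lut, tuning_word, n, acc_bits=32):
    """Direct digital synthesis sketch: a phase accumulator steps by
    `tuning_word` each sample and its top bits index the waveform LUT, so
    output frequency = tuning_word / 2**acc_bits * sample_rate.
    Updating the LUT in place is what enables the pre-distortion described
    in the text."""
    acc, mask = 0, (1 << acc_bits) - 1
    index_shift = acc_bits - int(math.log2(len(lut)))  # LUT size: power of two
    out = []
    for _ in range(n):
        out.append(lut[acc >> index_shift])
        acc = (acc + tuning_word) & mask  # phase wraps around naturally
    return out
```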
The voltage and current feedback data output by the A/D converter circuits 826 and 828 can be received and processed (for example, by a first-in-first-out ("FIFO") buffer and a multiplexer) by the logic device 816 and stored in data memory for subsequent retrieval, for example, by the processor [00296] [00296] In certain forms, the voltage and current feedback data can be used to control the frequency and / or amplitude (for example, the current amplitude) of the drive signals. In one form, for example, the voltage and current feedback data can be used to determine the impedance phase. The frequency of the drive signal can then be controlled to minimize or reduce the difference between the determined impedance phase and an impedance phase setpoint (for example, 0º), thereby minimizing or reducing the effects of harmonic distortion and, correspondingly, enhancing the accuracy of the impedance phase measurement. The determination of the impedance phase and a frequency control signal can be implemented in the DSP processor 822, for example, with the frequency control signal being supplied as input to a DDS control algorithm implemented by the logic device 816. [00297] [00297] In another form, for example, the current feedback data can be monitored in order to maintain the current amplitude of the drive signal at a current amplitude setpoint. The current amplitude setpoint can be specified directly or determined indirectly based on specified voltage amplitude and power setpoints. In certain forms, control of the current amplitude can be implemented by a control algorithm, such as, for example, a proportional-integral-derivative (PID) control algorithm, in the DSP processor 822.
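The impedance-phase determination described above can be sketched as a single-bin discrete Fourier transform of the oversampled voltage and current channels at the drive frequency; the function name and the test signal are hypothetical:

```python
import cmath
import math

def impedance_phase(v_samples, i_samples, drive_freq, sample_rate):
    """Estimate the voltage-current phase difference at the drive frequency
    by correlating each sampled channel against a complex reference tone
    (a one-bin DFT). The result, in radians, is the quantity a frequency
    control loop would drive toward its setpoint (e.g. 0)."""
    n = len(v_samples)
    ref = [cmath.exp(-2j * math.pi * drive_freq * k / sample_rate) for k in range(n)]
    v = sum(s * r for s, r in zip(v_samples, ref))
    i = sum(s * r for s, r in zip(i_samples, ref))
    return cmath.phase(v / i)
```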
The variables controlled by the control algorithm to suitably control the current amplitude of the drive signal may include, for example, the scaling of the LUT waveform samples stored in the logic device 816 and / or the full-scale output voltage of the DAC circuit 818 (which supplies the input to the power amplifier 812) via a DAC circuit 834. [00298] [00298] The non-isolated stage 804 may further comprise a second processor 836 to provide, among other things, user interface (UI) functionality. In one form, the UI processor 836 may comprise an Atmel AT91SAM9263 processor with an ARM 926EJ-S core, available from Atmel Corporation, of San Jose, CA, USA, for example. Examples of UI functionality supported by the UI processor 836 may include audible and visual user feedback, communication with peripheral devices (for example, via a USB interface), communication with the footswitch, communication with an input device (for example, a touchscreen) and communication with an output device (for example, a speaker). The UI processor 836 can communicate with the DSP processor 822 and the logic device 816 (for example, via SPI buses). Although the UI processor 836 may primarily support UI functionality, it can also coordinate with the DSP processor 822 to implement risk mitigation in certain forms. For example, the UI processor 836 can be programmed to monitor various aspects of user actions and / or other inputs (for example, touchscreen inputs, footswitch inputs, temperature sensor inputs) and can disable the drive output of the generator 800 when an error condition is detected. [00299] [00299] In certain forms, both the DSP processor 822 and the UI processor 836 can, for example, determine and monitor the operational state of the generator 800. For the DSP processor 822, the operational state of the generator 800 can determine, for example, which control and / or diagnostic processes are implemented by the DSP processor 822.
For the UI processor 836, the operational state of the generator 800 can determine, for example, which elements of a UI (for example, display screens, sounds) are presented to a user. The respective DSP and UI processors 822 and 836 can independently maintain the current operational state of the generator 800 and recognize and evaluate possible transitions out of the current operational state. The DSP processor 822 can act as the master in this relationship and can determine when transitions between operational states should occur. The UI processor 836 can be aware of the valid transitions between operational states and can confirm that a particular transition is appropriate. For example, when the DSP processor 822 instructs the UI processor 836 to transition to a specific state, the UI processor 836 can verify that the requested transition is valid. If a requested transition between states is determined to be invalid by the UI processor 836, the UI processor 836 can cause the generator 800 to enter a fault mode. [00300] [00300] The non-isolated stage 804 can also comprise a controller 838 for monitoring input devices (for example, a capacitive touch sensor used to turn the generator 800 on and off, a capacitive touchscreen). In certain forms, the controller 838 may comprise at least one processor and / or other controller device in communication with the UI processor [00301] [00301] In certain forms, when the generator 800 is in an "off" state, the controller 838 can continue to receive operational power (for example, via a line from a power supply of the generator 800, such as the power supply 854 discussed below).
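The master / validator relationship between the two processors can be sketched as a transition table plus a validity check; the state names and transitions below are illustrative, not taken from the application:

```python
# Hypothetical operational states and their valid successors.
VALID_TRANSITIONS = {
    "off": {"standby"},
    "standby": {"ready", "off"},
    "ready": {"active", "standby", "fault"},
    "active": {"ready", "fault"},
    "fault": {"off"},
}

def request_transition(current, requested):
    """UI-processor-style check of a master-requested state change: a valid
    request is accepted; an invalid one drops the generator into fault mode."""
    return requested if requested in VALID_TRANSITIONS[current] else "fault"
```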
In this way, the controller 838 can continue to monitor an input device (for example, a capacitive touch sensor located on a front panel of the generator 800) for turning the generator on and off. [00302] [00302] In certain forms, the controller 838 may cause the generator 800 to provide audible or other sensory feedback to alert the user that an on or off sequence has been initiated. Such an alert can be provided at the beginning of an on or off sequence, and before the start of other processes associated with the sequence. [00303] [00303] In certain forms, the isolated stage 802 may comprise an instrument interface circuit 840 to, for example, provide a communication interface between a control circuit of a surgical instrument (for example, a control circuit comprising handle switches) and components of the non-isolated stage 804, such as the logic device 816, the DSP processor 822 and / or the UI processor 836. The instrument interface circuit 840 can exchange information with components of the non-isolated stage 804 by means of a communication link that maintains an adequate degree of electrical isolation between the isolated and non-isolated stages 802 and 804, such as, for example, an IR-based communication link. Power can be supplied to the instrument interface circuit 840 using, for example, a low-dropout voltage regulator powered by an isolation transformer driven from the non-isolated stage 804. [00304] [00304] In one form, the instrument interface circuit 840 may comprise a logic circuit 842 (for example, a logic circuit, a programmable logic circuit, a PGA, an FPGA, a PLD) in communication with a signal conditioning circuit 844. The signal conditioning circuit 844 can be configured to receive a periodic signal from the logic circuit 842 (for example, a 2 kHz square wave) to generate a bipolar interrogation signal that has an identical frequency. The interrogation signal can be generated, for example, using a bipolar current source fed by a differential amplifier.
The interrogation signal can be communicated to a control circuit of the surgical instrument (for example, using a conductive pair in a cable that connects the generator 800 to the surgical instrument) and monitored to determine a state or configuration of the control circuit. The control circuit may comprise a number of switches, resistors and / or diodes to modify one or more characteristics (for example, amplitude, [00305] [00305] In one form, the instrument interface circuit 840 may comprise a first data circuit interface 846 to enable the exchange of information between the logic circuit 842 (or another element of the instrument interface circuit 840) and a first data circuit disposed in, or otherwise associated with, a surgical instrument. In certain forms, for example, a first data circuit may be disposed in a cable integrally attached to a handle of the surgical instrument, or in an adapter for interfacing between a specific type or model of surgical instrument and the generator 800. The first data circuit can be implemented in any suitable manner and can communicate with the generator in accordance with any suitable protocol, including, for example, as described herein with respect to the first data circuit. In certain forms, the first data circuit may comprise a non-volatile storage device, such as an EEPROM device. In certain forms, the first data circuit interface 846 can be implemented separately from the logic circuit 842 and comprise suitable circuitry (for example, separate logic devices, a processor) to enable communication between the logic circuit 842 and the first data circuit. In other forms, the first data circuit interface 846 can be integral with the logic circuit 842. [00306] [00306] In certain forms, the first data circuit can store information related to the specific surgical instrument with which it is associated.
This information may include, for example, a model number, a serial number, a number of operations in which the surgical instrument has been used and / or any other types of information. This information can be read by the instrument interface circuit 840 (for example, the logic circuit 842), transferred to a component of the non-isolated stage 804 (for example, to the logic device 816, the DSP processor 822 and / or the UI processor 836) for presentation to a user by means of an output device and / or to control a function or operation of the generator 800. In addition, any type of information can be communicated to the first data circuit for storage through the first data circuit interface 846 (for example, using the logic circuit 842). This information may include, for example, an updated number of operations in which the surgical instrument has been used and / or the dates and / or times of its use. [00307] [00307] As previously discussed, a surgical instrument can be detachable from a handle (for example, the multifunctional surgical instrument can be detachable from the handle) to promote interchangeability and / or disposability of the instrument. In such cases, conventional generators may be limited in their ability to recognize the specific instrument configurations being used, as well as to optimize the control and diagnostic processes accordingly. The addition of readable data circuits to surgical instruments to address this issue is, however, problematic from a compatibility standpoint. For example, designing a surgical instrument so that it remains backward compatible with generators that lack the requisite data-reading functionality may be impractical due, for example, to different signaling schemes, design complexity and cost.
The instrument forms discussed here address these concerns through the use of data circuits that can be implemented in existing surgical instruments economically and with minimal design changes, to preserve the compatibility of the surgical instruments with current generator platforms. [00308] [00308] Additionally, forms of the generator 800 can enable communication with instrument-based data circuits. For example, the generator 800 can be configured to communicate with a second data circuit contained in an instrument (for example, the multifunctional surgical instrument). In some forms, the second data circuit can be implemented in a manner similar to that of the first data circuit described here. The instrument interface circuit 840 may comprise a second data circuit interface 848 to enable such communication. In one form, the second data circuit interface 848 can comprise a tri-state digital interface, although other interfaces can also be used. In certain forms, the second data circuit can generally be any circuit for transmitting and/or receiving data. In one form, for example, the second data circuit can store information related to the specific surgical instrument with which it is associated. This information may include, for example, a model number, a serial number, a number of operations in which the surgical instrument has been used, and/or any other types of information. [00309] [00309] In some forms, the second data circuit can store information about the electrical and/or ultrasonic properties of an associated ultrasonic transducer, end effector or ultrasonic drive system. For example, the first data circuit can indicate a burn-in frequency slope, as described here. In addition or alternatively, any type of information can be communicated to the second data circuit for storage in it via the second data circuit interface 848 (for example, using the logic circuit 842).
This information may include, for example, an updated number of operations in which the surgical instrument has been used and/or the dates and/or times of its use. In certain forms, the second data circuit can transmit data captured by one or more sensors (for example, an instrument-based temperature sensor). In certain forms, the second data circuit can receive data from the generator 800 and provide an indication to a user (for example, a light-emitting indication or other visible indication) based on the received data. [00310] [00310] In certain forms, the second data circuit and the second data circuit interface 848 can be configured so that communication between the logic circuit 842 and the second data circuit can be carried out without the need to provide additional conductors for this purpose (for example, dedicated conductors of a cable that connects a handle to the generator 800). In one form, for example, information can be communicated to and from the second data circuit using a one-wire bus communication scheme implemented on existing wiring, such as one of the conductors used to transmit interrogation signals from the signal conditioning circuit 844 to a control circuit in a handle. In this way, changes or modifications to the design of the surgical instrument that might otherwise be necessary are minimized or reduced. In addition, because different types of communication implemented on a common physical channel can be separated based on frequency, the presence of a second data circuit can be "invisible" to generators that lack the requisite data-reading functionality, which therefore allows backward compatibility of the surgical instrument. [00311] [00311] In certain forms, the isolated stage 802 may comprise at least one blocking capacitor 850-1 connected to the drive signal output 810b to prevent the passage of direct current to a patient.
A single blocking capacitor may be required to comply with medical regulations and standards, for example. Although failures in single-capacitor designs are relatively uncommon, such failures can still have negative consequences. In one form, a second blocking capacitor 850-2 can be provided in series with the blocking capacitor 850-1, with the leakage current at a point between the blocking capacitors 850-1 and 850-2 being monitored, for example, by an ADC circuit 852 that samples a voltage induced by the leakage current. The samples can be received, for example, by the logic circuit 842. Based on changes in the leakage current (as indicated by the voltage samples), the generator 800 can determine when one of the blocking capacitors 850-1 or 850-2 has failed, thereby providing a benefit over single-capacitor designs that have a single point of failure. [00312] [00312] In certain forms, the non-isolated stage 804 may comprise a power supply 854 for delivering direct current power at a suitable voltage and current. The power supply may comprise, for example, a 400 W power supply to deliver a system voltage of 48 VDC. The power supply 854 can further comprise one or more DC/DC voltage converters 856 that receive the output of the power supply in order to generate direct current outputs at the voltages and currents required by the various components of the generator 800. As discussed above with respect to the controller 838, one or more of the DC/DC voltage converters 856 can receive an input from the controller 838 when activation of the "on/off" input device by a user is detected by the controller 838, to enable operation or activation of the DC/DC voltage converters 856. [00313] [00313] Figure 21 illustrates an example of a generator 900, which is a form of the generator 800 (Figure 20). The generator 900 is configured to supply multiple types of energy to a surgical instrument.
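Returning to the blocking-capacitor arrangement of paragraph [00311], the failure-detection logic can be sketched in a few lines. This is a minimal illustration, not the generator firmware; the function name, baseline and threshold values are assumptions made for the sketch.

```python
def blocking_capacitor_failed(voltage_samples, baseline_v, threshold_v=0.5):
    """Flag a failure of one of two series blocking capacitors.

    A voltage induced by leakage current is sampled at the point between
    the capacitors (paragraph [00311]); a sustained shift of the mean
    sampled voltage away from its expected baseline suggests that one
    capacitor has shorted.  Baseline and threshold are illustrative only.
    """
    if not voltage_samples:
        return False
    mean_v = sum(voltage_samples) / len(voltage_samples)
    return abs(mean_v - baseline_v) > threshold_v
```

In a real design the comparison would run continuously on the logic circuit, with the threshold chosen against the applicable leakage-current limits.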
The generator 900 provides ultrasonic and RF signals to power a surgical instrument, independently or simultaneously. The ultrasonic and RF signals can be provided alone or in combination, and can be provided simultaneously. As indicated above, at least one generator output can deliver multiple types of energy (for example, ultrasonic, bipolar or monopolar RF, irreversible and/or reversible electroporation, and/or microwave energy, among others) through a single port, and these signals can be delivered separately or simultaneously to the end effector to treat tissue. [00314] [00314] The generator 900 comprises a processor 902 coupled to a waveform generator 904. The processor 902 and the waveform generator 904 are configured to generate various signal waveforms based on information stored in a memory coupled to the processor 902, not shown for clarity of description. The digital information associated with a waveform is provided to the waveform generator 904, which includes one or more DAC circuits to convert the digital input to an analog output. The analog output is fed to an amplifier 906 for signal conditioning and amplification. The conditioned and amplified output of the amplifier 906 is coupled to a power transformer 908. The signals are coupled via the power transformer 908 to the secondary side, which is on the patient-isolation side. A first signal of a first energy modality is supplied to the surgical instrument between the terminals identified as ENERGY1 and RETURN. A second signal of a second energy modality is coupled through a capacitor 910 and is supplied to the surgical instrument between the terminals identified as ENERGY2 and RETURN. It will be recognized that more than two energy modalities can be output and, therefore, the subscript "n" can be used to designate that up to n ENERGY terminals can be provided, where n is a positive integer greater than 1.
It will also be recognized that up to "n" return paths RETURN can be provided without departing from the scope of this description. [00315] [00315] A first voltage detection circuit 912 is coupled across the terminals identified as ENERGY1 and the RETURN path to measure the output voltage between them. A second voltage detection circuit 924 is coupled across the terminals identified as ENERGY2 and the RETURN path to measure the output voltage between them. A current detection circuit 914 is arranged in series with the RETURN leg on the secondary side of the power transformer 908, as shown, to measure the output current for either energy modality. If different return paths are provided for each energy modality, then a separate current detection circuit would be provided in each return leg. The outputs of the first and second voltage detection circuits 912, 924 are supplied to respective isolation transformers 916, 922, and the output of the current detection circuit 914 is supplied to another isolation transformer 918. The outputs of the isolation transformers 916, 918, 922 on the primary side of the power transformer 908 (the non-patient-isolated side) are supplied to one or more ADC circuits 926. The digitized output of the ADC circuit 926 is provided to the processor 902 for further processing and computation. The output voltage and output current feedback information can be used to adjust the output voltage and current supplied to the surgical instrument, and to compute the output impedance, among other parameters.
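The impedance feedback computation described above (dividing the sensed output voltage by the sensed output current, using the digitized samples) can be sketched as follows. This is a minimal illustration under stated assumptions: the function name is hypothetical, and ADC scaling factors and isolation-transformer gains are ignored.

```python
import math


def output_impedance(voltage_samples, current_samples):
    """Estimate the load impedance magnitude as V_rms / I_rms from
    digitized voltage-sense and current-sense samples.

    Real generator firmware would first apply calibration and scaling to
    the raw ADC codes; both are omitted here for clarity.
    """
    v_rms = math.sqrt(sum(v * v for v in voltage_samples) / len(voltage_samples))
    i_rms = math.sqrt(sum(i * i for i in current_samples) / len(current_samples))
    return float("inf") if i_rms == 0 else v_rms / i_rms
```

An open circuit (zero sensed current) is reported as infinite impedance, which is a common convention for tissue-contact detection.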
Input/output communications between the processor 902 and the patient-isolated circuits are provided via an interface circuit [00316] [00316] In one aspect, the impedance can be determined by the processor 902 by dividing the output of the first voltage detection circuit 912, coupled across the terminals identified as ENERGY1/RETURN, or of the second voltage detection circuit 924, coupled across the terminals identified as ENERGY2/RETURN, by the output of the current detection circuit 914 arranged in series with the RETURN leg on the secondary side of the power transformer 908. The outputs of the first and second voltage detection circuits 912, 924 are provided to separate isolation transformers 916, 922, and the output of the current detection circuit 914 is provided to another isolation transformer 918. The digitized voltage and current detection measurements from the ADC circuit 926 are provided to the processor 902 to compute the impedance. As an example, the first energy modality ENERGY1 can be ultrasonic energy and the second energy modality ENERGY2 can be RF energy. However, in addition to the ultrasonic and bipolar or monopolar RF energy modalities, other energy modalities include irreversible and/or reversible electroporation and/or microwave energy, among others. Furthermore, although the example illustrated in Figure [00317] [00317] As shown in Figure 21, the generator 900 comprising at least one output port may include a power transformer 908 with a single output and multiple taps to provide power, in the form of one or more energy modalities such as ultrasonic, bipolar or monopolar RF, irreversible and/or reversible electroporation, and/or microwave energy, among others, to the end effector, for example, depending on the type of tissue treatment being performed.
For example, the generator 900 can deliver energy with higher voltage and lower current to drive an ultrasonic transducer, with lower voltage and higher current to drive RF electrodes to seal tissue, or with a coagulation waveform for spot coagulation using monopolar or bipolar RF electrosurgical electrodes. The output waveform of the generator 900 can be steered, switched or filtered to deliver the frequency to the end effector of the surgical instrument. The connection of an ultrasonic transducer to the output of the generator 900 would preferably be located between the output identified as ENERGY1 and RETURN, as shown in Figure 21. In one example, a connection of bipolar RF electrodes to the output of the generator 900 would preferably be located between the output identified as ENERGY2 and RETURN. In the case of a monopolar output, the preferred connections would be an active electrode (for example, a pencil or another probe) to the ENERGY output and a suitable return pad connected to the RETURN output. [00318] [00318] Additional details are disclosed in US patent application publication No. 2017/0086914, entitled TECHNIQUES FOR OPERATING [00319] [00319] As used throughout this description, the term "wireless" and its derivatives can be used to describe circuits, devices, systems, methods, techniques, communication channels, etc., that can communicate data through the use of modulated electromagnetic radiation via a non-solid medium. The term does not imply that the associated devices do not contain any wires, although in some forms they may not.
The communication module can implement any of a number of wireless or wired communication standards or protocols, including, but not limited to, Wi-Fi (IEEE 802.11 family), WiMAX (IEEE 802.16 family), IEEE 802.20, long-term evolution (LTE), Ev-DO, HSPA+, HSDPA+, HSUPA+, EDGE, GSM, GPRS, CDMA, TDMA, DECT, Bluetooth, Ethernet derivatives thereof, as well as any other wireless and wired protocols that are designated as 3G, 4G, 5G, and beyond. The computing module can include a plurality of communication modules. For example, a first communication module can be dedicated to shorter-range wireless communications such as Wi-Fi and Bluetooth, and a second communication module can be dedicated to longer-range wireless communications such as GPS, EDGE, GPRS, CDMA, WiMAX, LTE, Ev-DO, and others. [00320] [00320] As used in the present invention, a processor or processing unit is an electronic circuit that performs operations on some external data source, usually memory or some other data stream. The term is used in the present invention to refer to the central processor (central processing unit) in a computer system or systems (specifically systems on a chip (SoCs)) that combine several specialized "processors". [00321] [00321] As used here, a system on a chip or system on chip (SoC or SOC) is an integrated circuit (also known as an "IC" or "chip") that integrates all components of a computer or other electronic system. It can contain digital, analog, mixed-signal and often radio-frequency functions, all on a single substrate. An SoC integrates a microcontroller (or microprocessor) with advanced peripherals such as a graphics processing unit (GPU), a Wi-Fi module, or a coprocessor. An SoC may or may not contain internal memory. [00322] [00322] As used here, a microcontroller or controller is a system that integrates a microprocessor with peripheral circuits and memory.
A microcontroller (or MCU, for microcontroller unit) can be implemented as a small computer on a single integrated circuit. It can be similar to an SoC; an SoC can include a microcontroller as one of its components. A microcontroller can contain one or more processor cores (CPUs) along with memory and programmable input/output peripherals. Program memory in the form of ferroelectric RAM, NOR flash or OTP ROM is also often included on the chip, as well as a small amount of RAM. Microcontrollers can be used for embedded applications, in contrast to microprocessors used in personal computers or other general-purpose applications consisting of several separate integrated circuits. [00323] [00323] As used in the present invention, the term controller or microcontroller can be a stand-alone chip or IC (integrated circuit) device that interfaces with a peripheral device. This can be a link between two parts of a computer, or a controller on an external device that manages the operation of (and the connection to) that device. [00324] [00324] Any of the processors or microcontrollers in the present invention can be implemented by any single-core or multi-core processor, such as those known under the trade name ARM Cortex, by Texas Instruments.
In one aspect, the processor may be an LM4F230H5QR ARM Cortex-M4F processor core, available from Texas Instruments, for example, comprising an integrated 256 KB single-cycle flash memory, or other non-volatile memory, of up to 40 MHz, a prefetch buffer to improve performance above 40 MHz, a 32 KB single-cycle serial random access memory (SRAM), internal read-only memory (ROM) loaded with StellarisWare® software, 2 KB of electrically erasable programmable read-only memory (EEPROM), one or more pulse width modulation (PWM) modules, one or more analog quadrature encoder inputs (QEI), one or more 12-bit analog-to-digital converters (ADCs) with 12 analog input channels, details of which are available in the product data sheet. [00325] [00325] In one aspect, the processor may comprise a safety controller comprising two controller-based families, such as TMS570 and RM4x, known under the trade name Hercules ARM Cortex R4, also by Texas Instruments. The safety controller can be configured specifically for IEC 61508 and ISO 26262 safety-critical applications, among others, to provide advanced integrated safety features while delivering scalable performance, connectivity and memory options. [00326] [00326] The modular devices include the modules (as described in connection with Figures 3 and 9, for example) that are receivable within a central surgical controller, and the surgical devices or instruments that can be connected to the various modules in order to connect or pair with the corresponding central surgical controller. The modular devices include, for example, smart surgical instruments, medical imaging devices, suction/irrigation devices, smoke evacuators, energy generators, fans, insufflators and displays. The modular devices described here can be controlled by control algorithms.
The control algorithms can be run on the modular device itself, on the central surgical controller with which the specific modular device is paired, or on both the modular device and the central surgical controller (for example, through a distributed computing architecture). In some exemplifications, the control algorithms of the modular devices control the devices based on data detected by the modular device itself (that is, by sensors on, in or connected to the modular device). This data can be related to the patient being operated on (for example, tissue properties or insufflation pressure) or to the modular device itself (for example, the rate at which a knife is being advanced, the motor current, or the energy levels). For example, a control algorithm for a surgical stapling and cutting instrument can control the rate at which the instrument's motor drives its knife through the tissue according to the resistance encountered by the knife as it advances.

Visualization systems

[00327] [00327] During a surgical procedure, it may be necessary for a surgeon to manipulate tissues to achieve a desired medical result. The surgeon's actions are limited by what is visually observable at the surgical site. Thus, the surgeon may not be aware, for example, of the arrangement of vascular structures beneath the tissues being manipulated during the procedure. Since the surgeon is unable to view the vasculature beneath a surgical site, the surgeon may accidentally cut one or more blood vessels during the procedure. The solution is a surgical visualization system that can capture imaging data of the surgical site for presentation to a surgeon, and this presentation may include information related to the presence and depth of vascular structures located beneath the surface of the surgical site. [00328] [00328] In one aspect, the central surgical controller 106 incorporates a visualization system 108 for capturing imaging data during a surgical procedure.
The visualization system 108 can include one or more light sources and one or more light sensors. The one or more light sources and one or more light sensors may be incorporated together into a single device or may comprise one or more separate devices. The one or more light sources can be directed to illuminate portions of the surgical field. The one or more image sensors can receive light reflected or refracted from the surgical field, including light reflected or refracted from tissue and/or surgical instruments. The following description contemplates all of the hardware and software processing techniques disclosed above and in the applications incorporated herein by reference as presented above. [00329] [00329] In some aspects, the visualization system 108 can be integrated with a surgical system 100 as described above and depicted in Figures 1 and 2. In addition to the visualization system [00330] [00330] In some non-limiting examples, the imaging data generated by the visualization system 108 can be analyzed by computational components embedded in the visualization system 108, and the results of the analysis can be communicated to the central surgical controller 106. In alternative non-limiting examples, the imaging data generated by the visualization system 108 can be communicated directly to the central surgical controller 106, and the data can be analyzed by computational components in the central controller system 106. The central surgical controller 106 can communicate the results of the image analysis to any one or more of the other components of the surgical system. In some other non-limiting examples, the central surgical controller can communicate the image data and/or the results of the image analysis to the cloud computing system 104. [00331] [00331] Figures 22A to 22D and Figures 23A to 23F depict various aspects of an example of a display system 2108 that can be incorporated into a surgical system.
The display system 2108 may include an imaging control unit 2002 and a hand unit 2020. The imaging control unit 2002 may include one or more light sources, a power supply for the one or more light sources, one or more types of data communication interfaces (including USB, Ethernet or wireless interfaces 2004), and one or more video outputs 2006. The imaging control unit 2002 can also include an interface, such as a USB interface 2010, configured to transmit integrated image and video capture data to a USB-enabled device. The imaging control unit 2002 may also include one or more computational components including, without limitation, a processor unit, a transitory memory unit, a non-transitory memory unit, an image processing unit, a bus structure to form data connections between the computational components, and any interface devices (for example, input and/or output) needed to receive and transmit information to components not included in the imaging control unit. The non-transitory memory can also contain instructions that, when executed by the processor unit, can perform any number of manipulations of data that can be received from the hand unit 2020 and/or from computational devices not included in the imaging control unit. [00332] [00332] The light sources may include a white light source [00333] [00333] In a non-limiting aspect, the hand unit 2020 can include a body 2021, a camera scope cable 2015 attached to the body 2021, and an elongated camera probe 2024. The body 2021 of the hand unit 2020 can include hand unit control buttons 2022 or other controls to enable a healthcare professional using the hand unit 2020 to control the operations of the hand unit 2020 or of other components of the imaging control unit 2002, including, for example, the light sources. The camera scope cable 2015 may include one or more electrical conductors and one or more optical fibers.
The camera scope cable 2015 can terminate at its proximal end with a camera head connector 2008, and the camera head connector 2008 is configured to mate with one or more optical and/or electrical interfaces of the imaging control unit 2002. The electrical conductors can supply power to the hand unit 2020, including the body 2021 and the elongated camera probe 2024, and/or to any electrical components within the hand unit 2020, including the body 2021 and/or the elongated camera probe 2024. The electrical conductors can also serve to provide bidirectional data communication between any one or more components of the hand unit 2020 and the imaging control unit 2002. The one or more optical fibers can conduct illumination from the one or more light sources in the imaging control unit 2002, through the body 2021 of the hand unit, to a distal end of the elongated camera probe 2024. In some non-limiting aspects, the one or more optical fibers can also conduct light reflected or refracted from the surgical site to one or more optical sensors disposed in the elongated camera probe 2024, in the body 2021 of the hand unit and/or in the imaging control unit 2002. [00334] [00334] Figure 22B (a top plan view) depicts in more detail some aspects of a hand unit 2020 of the display system 2108. The body 2021 of the hand unit can be constructed of a plastic material. The hand unit control buttons 2022 or other controls can have a rubber overmolding to protect the controls while allowing them to be manipulated by the surgeon. The camera scope cable 2015 may have optical fibers integrated with electrical conductors, and the camera scope cable 2015 may have a flexible, protective outer sheath, such as PVC. In some non-limiting examples, the camera scope cable 2015 can be about 10 feet long to allow ease of use during a surgical procedure. The length of the camera scope cable 2015 can be in the range of about 5 feet to about feet.
Non-limiting examples of the length of the camera scope cable 2015 may include about 5 feet, about 6 feet, about 7 feet, about 8 feet, about 9 feet, about 10 feet, about 11 feet, about 12 feet, about 13 feet, about 14 feet, about 15 feet, or any length or range of lengths therebetween. The elongated camera probe 2024 can be manufactured from a rigid material, such as stainless steel. In some aspects, the elongated camera probe 2024 can be joined to the body 2021 of the hand unit by means of a rotating collar 2026. The rotating collar 2026 can enable the elongated camera probe 2024 to be rotated relative to the body 2021 of the hand unit. In some aspects, the elongated camera probe 2024 can terminate at its distal end with a plastic window 2028 sealed with epoxy. [00335] [00335] The side plan view of the hand unit, shown in Figure 22C, illustrates that a light or image sensor 2030 can be disposed at a distal end 2032a of the elongated camera probe or within the body of the hand unit 2032b. In some alternative aspects, the light or image sensor 2030 can be arranged with additional optical elements in the imaging control unit 2002. Figure 22C further shows an example of a light sensor 2030 comprising a CMOS image sensor 2034 disposed within a bezel 2036 having a radius of about 4 mm. Figure 22D illustrates aspects of the CMOS image sensor 2034, depicting the active area 2038 of the image sensor. Although the CMOS image sensor in Figure 22C is shown disposed within a bezel 2036 that has a radius of about 4 mm, it can be recognized that such a combination of sensor and bezel can be of any size useful for being disposed within the elongated camera probe 2024, the hand unit 2021 or the imaging control unit 2002. Some non-limiting examples of such alternative bezels may include a 5.5 mm bezel 2136a, a 4 mm bezel 2136b, a 2.7 mm bezel 2136c and a 2 mm bezel 2136d. It can be recognized that the image sensor may also comprise a CCD image sensor.
The CMOS or CCD sensor can comprise an array of individual light sensing elements (pixels). [00336] [00336] Figures 23A to 23F depict various aspects of some examples of light sources and their control that can be incorporated into the visualization system 2108. [00337] [00337] Figure 23A illustrates an aspect of a laser illumination system having a plurality of laser beams emitting a plurality of wavelengths of electromagnetic energy. As can be seen in the figure, the illumination system 2700 can comprise a red laser beam 2720, a green laser beam 2730 and a blue laser beam 2740, which are all optically coupled together via an optical fiber 2755. As can be seen in the figure, each of the laser beams can have a corresponding light sensing element or electromagnetic sensor 2725, 2735, 2745, respectively, to sense the output of the specific laser beam or wavelength. [00338] [00338] Additional descriptions regarding the laser illumination system shown in Figure 23A for use in a surgical display system 2108 can be found in US patent application publication No. 2014/0268860, entitled CONTROLLING THE INTEGRAL LIGHT ENERGY OF A LASER PULSE, filed on March 15, 2014, which was granted on October 3, 2017 as US patent No. 9,777,913, the contents of which are incorporated herein by reference in their entirety and for all purposes. [00339] [00339] Figure 23B illustrates the operational cycles of a sensor used in rolling readout mode. It will be understood that the x direction corresponds to time and that the diagonal lines 2202 indicate the activity of an internal pointer that reads out each frame of data, one row at a time. The same pointer is responsible for resetting each row of pixels for the next exposure period. The net integration time for each row 2219a to 2219c is equivalent, but the rows are staggered in time with respect to one another due to the reset and rolling readout process.
Therefore, for any scenario in which adjacent frames are required to represent different constitutions of light, the only option for having each row consistent is to pulse the light between the readout cycles 2230a to 2230c. More specifically, the maximum available period corresponds to the sum of the blanking time plus any time during which the optical black (OB) rows (2218, 2220) are read at the beginning or at the end of the frame. [00340] [00340] Figure 23B illustrates the operational cycles of a sensor used in rolling readout mode, or during the sensor readout 2200. The frame readout can start at, and can be represented by, the vertical line 2210. The readout period is represented by the diagonal or slanted line 2202. The sensor can be read out row by row, with the top of the downward-sloping edge being the top row 2212 of the sensor and the bottom of the downward-sloping edge being the bottom row 2214 of the sensor. The time between the readout of the last row and the next readout cycle can be called the blanking time 2216a to 2216d. It can be understood that the blanking time 2216a to 2216d can be the same between successive readout cycles or can differ between successive readout cycles. It should be noted that some of the sensor's pixel rows may be covered with a light shield (for example, a metallic coating or any other substantially black layer of another material). These covered pixel rows can be called optical black rows 2218 and 2220. [00341] [00341] As shown in Figure 23B, these optical black rows 2218 and 2220 can be located at the top of the pixel array, at the bottom of the pixel array, or at both the top and bottom of the pixel array.
In some forms, it may be desirable to control the amount of electromagnetic radiation, for example, light, to which a pixel is exposed, [00342] [00342] It should be noted that the condition for a light pulse 2230a to 2230c to be read in only one frame, without interfering with neighboring frames, is for the given light pulse 2230a to 2230c to fire during the blanking time 2216. Because the optical black rows 2218 and 2220 are insensitive to light, the frame (m) time of the trailing optical black rows 2220 and the frame (m+1) time of the leading optical black rows 2218 can be added to the blanking time 2216 to determine the maximum firing window of the light pulse 2230. [00343] [00343] In some aspects, Figure 23B shows an example of a timing diagram for sequential frame captures by a conventional CMOS sensor. Such a CMOS sensor can incorporate a Bayer pattern of color filters, as shown in Figure 23C. It is recognized that the Bayer pattern provides greater luminance detail than chrominance detail. It can also be recognized that the sensor has a reduced spatial resolution, since a total of 4 adjacent pixels is required to produce the color information for the aggregated spatial portion of the image. In an alternative approach, the color image can be constructed by rapid, high-speed strobing of the visualized area with a variety of optical sources (laser diodes or light-emitting diodes) having different central optical wavelengths. [00344] [00344] The optical strobe system may be under the control of the camera system, and may include a specially designed CMOS sensor with high-speed readout. The main benefit is that the sensor can achieve the same spatial resolution with significantly fewer pixels compared to conventional Bayer or 3-sensor cameras. Therefore, the physical space occupied by the pixel array can be reduced. The actual pulse periods (2230a to 2230c) may differ within the repeating pattern, as shown in Figure 23B.
This is useful, for example, to give more time to the components that require the most light energy or to those that have the weakest sources. As long as the average captured frame rate is an integer multiple of the required final frame rate, the data can simply be buffered in the signal processing chain as appropriate.

[00345] The facility to reduce the integrated-circuit area of the CMOS sensor to the extent allowed by the combination of all these methods is particularly attractive for small-diameter (~3 to 10 mm) endoscopy. In particular, it allows for endoscope designs in which the sensor is located in the space-constrained distal end, thereby greatly reducing the complexity and cost of the optical section while providing high-definition video. A consequence of this approach is that reconstructing each final full-color image requires data to be fused from three snapshots in time. Any movement within the scene relative to the optical frame of reference of the endoscope will generally degrade the perceived resolution, since the edges of objects appear at slightly different locations within each captured component. This description describes a way to reduce this problem, which exploits the fact that spatial resolution is much more important for luminance information than for chrominance.

[00346] The basis of the approach is that, instead of firing monochromatic light during each frame, combinations of the three wavelengths are used to provide all the luminance information within a single image. The chrominance information is derived from separate frames with, for example, a repeating pattern such as Y-Cb-Y-Cr (Figure 23D). Although it is possible to provide pure luminance data by an astute choice of pulse ratios, the same is not true for chrominance.
[00347] In one aspect, as illustrated in Figure 23D, an endoscopic system 2300a can comprise a pixel array 2302a with uniform pixels, and the system 2300a can be operated to receive luminance (Y) pulses 2304a, chroma-blue (Cb) pulses 2306a and chroma-red (Cr) pulses 2308a.

[00348] To complete a full color image, the two chrominance components must also be provided. However, the same algorithm that was applied for luminance cannot be applied directly to chrominance, since chrominance is signed, as reflected in the fact that some of the RGB coefficients are negative. The solution is to add a degree of luminance of sufficient magnitude that all the final pulse energies become positive. As long as the color fusion process in the ISP is aware of the composition of the chrominance frames, they can be decoded by subtracting the appropriate amount of luminance from a neighboring frame. The pulse energy proportions are given by:

Y = 0.183·R + 0.614·G + 0.062·B
Cb = λ·Y − 0.101·R − 0.339·G + 0.439·B
Cr = δ·Y + 0.439·R − 0.399·G − 0.040·B

where

λ ≥ 0.339 / 0.614 = 0.552
δ ≥ 0.399 / 0.614 = 0.650

[00349] It follows that, if the factor λ is equal to 0.552, the red and green components are exactly canceled, in which case the Cb information can be provided with pure blue light. Similarly, setting δ = 0.650 cancels the blue and green components for Cr, which becomes pure red. This specific example is illustrated in Figure 23E, which also represents λ and δ as integer multiples of 1/2⁸. This is a convenient approximation for digital frame reconstruction.

[00350] In the case of the Y-Cb-Y-Cr pulse scheme, the image data is already in the YCbCr space after the color fusion. So, in this case, it makes sense to perform the luminance- and chrominance-based operations first, before converting back to linear RGB to perform color correction, etc.
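The cancellation claims above can be checked numerically. The sketch below encodes the pulse-energy coefficients from the relations above and verifies that λ = 0.339/0.614 removes the red and green contributions from Cb, while δ = 0.399/0.614 removes the green and blue contributions from Cr (to the rounding of the published coefficients).

```python
# Numeric check of the pulse-energy relations: with lambda = 0.552 the red
# and green contributions to Cb cancel (pure blue pulse), and with
# delta = 0.650 the blue and green contributions to Cr cancel (pure red pulse).

Y = {'R': 0.183, 'G': 0.614, 'B': 0.062}  # luminance coefficients

def cb_coeffs(lam):
    """Effective R, G, B pulse weights for a Cb frame, Cb = lam*Y + base."""
    return {'R': lam * Y['R'] - 0.101,
            'G': lam * Y['G'] - 0.339,
            'B': lam * Y['B'] + 0.439}

def cr_coeffs(delta):
    """Effective R, G, B pulse weights for a Cr frame, Cr = delta*Y + base."""
    return {'R': delta * Y['R'] + 0.439,
            'G': delta * Y['G'] - 0.399,
            'B': delta * Y['B'] - 0.040}

lam = 0.339 / 0.614    # ~0.552: R and G weights of Cb vanish
delta = 0.399 / 0.614  # ~0.650: G and B weights of Cr vanish

cb = cb_coeffs(lam)
cr = cr_coeffs(delta)
# cb['R'] and cb['G'] are ~0 (pure blue); cr['G'] and cr['B'] are ~0 (pure red)
```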
[00351] The color fusion process is simpler than the "demosaicing" (color interpolation) required by the Bayer pattern (see Figure 23C), since there is no spatial interpolation. It does, however, require the buffering of frames in order to have all the necessary information available for each pixel. In general, data from the Y-Cb-Y-Cr pattern can be pipelined to produce one complete color image from every two raw captured images. This is accomplished by using each chrominance sample twice. Figure 23F shows the specific example of a 120 Hz frame rate providing a final 60 Hz video.

[00352] Additional descriptions regarding the control of the laser components of a lighting system, as represented in Figures 23B to 23F, for use in a surgical visualization system 108 can be found in US patent application publication No. 2014/0160318, entitled YCBCR PULSED ILLUMINATION SCHEME IN A LIGHT DEFICIENT ENVIRONMENT, filed on July 26, 2013, which was granted on December 6, 2016 as US patent No. 9,516,239, and US patent application publication No. 2014/0160319, entitled CONTINUOUS VIDEO IN A LIGHT DEFICIENT ENVIRONMENT, filed on July 26, 2013, which was granted on August 22,

[00353] During a surgical procedure, a surgeon may have to manipulate tissues to achieve a desired medical result. The surgeon's actions are limited by what is visually observable at the surgical site. Thus, the surgeon may not be aware, for example, of the arrangement of vascular structures beneath the tissues being manipulated during the procedure.

[00354] Since the surgeon is unable to view the vasculature beneath the surface of a surgical site, the surgeon may accidentally cut one or more blood vessels during the procedure.
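The fusion order described above can be sketched minimally as follows: each Y capture is fused with the most recent Cb and Cr captures, so each chrominance sample is consumed twice and, in steady state, every two raw frames yield one full-color frame (for example, 120 Hz capture producing 60 Hz video). The function and frame labels are hypothetical.

```python
# Minimal sketch of Y-Cb-Y-Cr color fusion: pair each luminance (Y) frame
# with the most recently captured chrominance (Cb, Cr) frames. The first
# Y frames are dropped until one of each chroma component has been seen.

def fuse_ycbycr(frames):
    """frames: list of ('Y' | 'Cb' | 'Cr', data) tuples in capture order.
    Returns fused (y, cb, cr) triples, one per Y frame after startup."""
    out, last = [], {'Cb': None, 'Cr': None}
    for kind, data in frames:
        if kind in last:
            last[kind] = data                      # remember newest chroma
        elif None not in last.values():            # Y frame, both chromas seen
            out.append((data, last['Cb'], last['Cr']))
    return out

stream = [('Y', 'y1'), ('Cb', 'cb1'), ('Y', 'y2'), ('Cr', 'cr1'),
          ('Y', 'y3'), ('Cb', 'cb2'), ('Y', 'y4'), ('Cr', 'cr2')]
print(fuse_ycbycr(stream))  # [('y3', 'cb1', 'cr1'), ('y4', 'cb2', 'cr1')]
```

Note that 'cr1' is reused for two consecutive output frames, matching the "each chrominance sample twice" behavior described above.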
[00355] Therefore, it is desirable to have a surgical visualization system that can capture imaging data from the surgical site for presentation to a surgeon, where the presentation may include information related to the presence of vascular structures located beneath the surface of a surgical site.

[00356] Some aspects of the present description also provide a control circuit configured to control the illumination of a surgical site using one or more light sources, such as laser light sources, and to receive imaging data from one or more image sensors. In some aspects, the present description provides a non-transitory, computer-readable medium storing computer-readable instructions that, when executed, cause a device to detect a blood vessel in a tissue and determine its depth below the tissue surface.

[00357] In some aspects, a surgical imaging system may include a plurality of light sources, each light source being configured to emit light having a specified central wavelength, a light sensor configured to receive a portion of the light reflected from a tissue sample when illuminated by one or more of the plurality of light sources, and a computing system. The computing system can be configured to: receive data from the light sensor when the tissue sample is illuminated by each of the plurality of light sources; determine a depth location of a structure within the tissue sample based on the data received by the light sensor when the tissue sample is illuminated by each of the plurality of light sources; and calculate visualization data for the structure and the depth of the structure. In some aspects, the visualization data may have a data format usable by a display system, and the structure may comprise one or more vascular tissues.
Acquisition of vascular images using NIR spectroscopy

[00358] In one aspect, a surgical image capture system may include a color cascade of independent illumination sources, comprising visible light and light outside the visible range, to image one or more tissues within a surgical site at different times and at different depths. The surgical image capture system can also detect or calculate characteristics of the light reflected and/or refracted from the surgical site. The characteristics of the light can be used to provide a composite image of the tissue within the surgical site and also to provide an analysis of underlying tissue not directly visible at the surface of the surgical site. The surgical imaging system can determine the depth location of tissue without the need for separate measuring devices.

[00359] In one aspect, the characteristic of the light reflected and/or refracted from the surgical site can be an amount of light absorbance at one or more wavelengths. The various chemical components of individual tissues can result in specific, wavelength-dependent patterns of light absorption.

[00360] In one aspect, the illumination sources may comprise a red laser source and a near-infrared laser source, and the one or more tissues to be imaged may include vascular tissue, such as veins or arteries. In some aspects, red laser sources (in the visible range) can be used to image some aspects of underlying vascular tissue based on spectroscopy in the visible red range. In some non-limiting examples, a red laser light source can generate illumination having a peak wavelength that can vary between 635 nm and 660 nm, inclusive. Non-limiting examples of a red laser peak wavelength may include about 635 nm, about 640 nm, about 645 nm, about 650 nm, about 655 nm, about 660 nm, or any value or range of values therebetween.
In some other aspects, near-infrared laser sources can be used to image underlying vascular tissue based on near-infrared spectroscopy. In some non-limiting examples, a near-infrared laser source can emit illumination with a wavelength that can vary between 750 and

[00361] Near-infrared spectroscopy ("NIRS") is a non-invasive technique that allows the determination of tissue oxygenation based on the spectrophotometric quantification of oxyhemoglobin and deoxyhemoglobin within a tissue. In some aspects, NIRS can be used to image vascular tissue directly, based on the difference in illumination absorbance between the vascular tissue and the non-vascular tissue. Alternatively, the vascular tissue can be visualized indirectly, based on a difference in the illumination absorbance of blood flow in the tissue before and after the application of physiological interventions, such as arterial and venous occlusion methods.

[00362] Instrumentation for near-IR (NIR) spectroscopy can be similar to instruments for the UV-visible and mid-IR ranges. Such spectroscopic instruments can include a light source, a detector and a dispersive element to select a specific near-IR wavelength for illuminating the tissue sample. In some aspects, the source may comprise an incandescent light source or a quartz-halogen light source. In some aspects, the detector may comprise a semiconductor photodiode (for example, InGaAs) or a photodiode array. In some aspects, the dispersive element may comprise a prism or, more commonly, a diffraction grating. Fourier-transform NIR instruments using an interferometer are also common, especially for wavelengths greater than about 1,000 nm. Depending on the sample, the spectrum can be measured in either reflection or transmission mode.
[00363] Figure 24 schematically represents an example of instrumentation 2400 for NIR spectroscopy, similar to instruments for the UV-visible and mid-IR ranges. A light source 2402 can emit a wide spectral range of illumination 2404 that can fall on a dispersive element 2406 (such as a prism or diffraction grating). The dispersive element 2406 can operate to select a narrow-wavelength portion 2408 of the light emitted by the broad-spectrum light source 2402, and the selected portion 2408 of the light can illuminate the tissue 2410. The light 2412 reflected from the tissue can be directed to a detector 2416 (for example, by means of a dichroic mirror 2414) and the intensity of the reflected light 2412 can be recorded. The wavelength of the light illuminating the tissue 2410 can be selected by the dispersive element 2406. In some aspects, the tissue 2410 can be illuminated only by a single narrow-wavelength portion 2408 selected by the dispersive element 2406 from the light source 2402. In other aspects, the tissue 2410 can be scanned with a variety of narrow-wavelength portions 2408 selected by the dispersive element 2406. In this way, a spectroscopic analysis of the tissue 2410 can be obtained over a range of NIR wavelengths.

[00364] Figure 25 schematically represents an example of instrumentation 2430 for determining NIRS based on Fourier-transform infrared imaging. In Figure 25, a laser source 2432 that emits light 2434 in the near-IR range illuminates a tissue sample 2440. The light 2436 reflected by the tissue 2440 is reflected 2442 by a mirror, such as a dichroic mirror 2444, toward a beam splitter 2446. The beam splitter 2446 directs a portion 2448 of the light reflected by the tissue 2440 to a stationary mirror 2450 and a portion 2452 of the light reflected by the tissue 2440 to a moving mirror 2454.
The moving mirror

[00365] An alternative to near-infrared light for determining hemoglobin oxygenation would be the use of monochromatic red light to determine the red-light absorbance characteristics of hemoglobin. The absorbance by hemoglobin of red light with a central wavelength of about 660 nm can indicate whether the hemoglobin is oxygenated (arterial blood) or deoxygenated (venous blood).

[00366] In some alternative surgical procedures, contrast agents can be used to improve the data collected on oxygenation and oxygen consumption by the tissue. In one non-limiting example, NIRS techniques can be used in conjunction with an intravenous (IV) bolus injection of a near-IR contrast agent such as indocyanine green ("ICG"), which has a peak absorbance at about 800 nm. ICG has been used in some medical procedures to measure cerebral blood flow.

Acquisition of vascular images using laser Doppler flowmetry

[00367] In one aspect, the characteristic of the light reflected and/or refracted from the surgical site can be a Doppler shift of the wavelength of the light relative to its illumination source.

[00368] Laser Doppler flowmetry can be used to visualize and characterize a flow of particles moving relative to an effectively stationary background. In this way, laser light scattered by moving particles, such as blood cells, may have a wavelength different from that of the original laser light source. In contrast, laser light scattered from the effectively stationary background (for example, the vascular tissue) may have a wavelength equal to that of the original laser light source. The change in the wavelength of the light scattered from the blood cells can reflect both the direction of blood cell flow relative to the laser source and the speed of the blood cells.
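The spectrophotometric quantification underlying NIRS can be sketched as a two-wavelength Beer-Lambert inversion: absorbances measured at two wavelengths are solved for the oxy- and deoxyhemoglobin concentrations, from which tissue oxygen saturation follows. The extinction coefficients below are placeholder values chosen for illustration only, not physiological constants.

```python
# Illustrative two-wavelength Beer-Lambert inversion for NIRS-style
# oxygenation estimates. Extinction coefficients are placeholders.

def hemoglobin_saturation(a1, a2, eps, path_len=1.0):
    """a1, a2: absorbances measured at wavelengths 1 and 2.
    eps: 2x2 matrix [[eps_HbO2_w1, eps_Hb_w1], [eps_HbO2_w2, eps_Hb_w2]].
    Solves A = eps @ [HbO2, Hb] * L and returns HbO2 / (HbO2 + Hb)."""
    (e11, e12), (e21, e22) = eps
    det = (e11 * e22 - e12 * e21) * path_len
    hbo2 = (a1 * e22 - a2 * e12) / det   # Cramer's rule, first unknown
    hb = (a2 * e11 - a1 * e21) / det     # Cramer's rule, second unknown
    return hbo2 / (hbo2 + hb)

eps = [[0.3, 3.2],   # placeholder coefficients at a red/NIR wavelength
       [1.2, 0.7]]   # placeholder coefficients at a second NIR wavelength
sat = hemoglobin_saturation(a1=0.88, a2=1.10, eps=eps)  # ~0.8 for these inputs
```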
Figures 26A to 26C illustrate the change in the wavelength of light scattered from blood cells that may be moving away from (Figure 26A) or toward (Figure 26C) the laser light source.

[00369] In each of Figures 26A to 26C, the original illumination light 2502 is represented with a relative central wavelength of 0. It can be seen from Figure 26A that light scattered from blood cells moving away from the laser source 2504 has a wavelength shifted by a certain amount 2506 to a wavelength greater than that of the laser source (and is therefore red-shifted). It can also be seen from Figure 26C that light scattered from blood cells moving toward the laser source 2508 has a wavelength shifted by a certain amount 2510 to a wavelength shorter than that of the laser source (and is therefore blue-shifted). The amount of wavelength shift (for example, 2506 or 2510) may depend on the speed of movement of the blood cells. In some aspects, the amount of red shift 2506 of some blood cells may be approximately the same as the amount of blue shift 2510 of some other blood cells. Alternatively, the amount of red shift 2506 of some blood cells may differ from the amount of blue shift 2510 of some other blood cells. Thus, the speed of the blood cells flowing away from the laser source, as shown in Figure 26A, may be less than the speed of the blood cells flowing toward the laser source, as shown in Figure 26C, based on the relative magnitudes of the wavelength shifts 2506 and 2510. In contrast, and as depicted in Figure 26B, light scattered from tissue that does not move relative to the laser light source (for example, blood vessels 2512 or non-vascular tissue 2514) may show no change in wavelength.

[00370] Figure 27 represents an aspect of instrumentation 2530 that can be used to detect a Doppler shift in laser light scattered from portions of a tissue 2540.
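The dependence of the shift on flow direction and speed can be sketched with the usual backscatter Doppler relation. The 2·v·cosθ·λ/c form is the standard laser Doppler flowmetry approximation, not a formula quoted in this description, and the sign convention below (positive velocity = motion away from the source) is an assumption made to match Figures 26A and 26C.

```python
# Sketch of the wavelength-shift picture above: light backscattered from a
# blood cell is Doppler-shifted in proportion to the cell's velocity
# component along the beam. Positive velocity means motion away from the
# source (red shift, longer wavelength); negative means motion toward it.

C = 2.998e8  # speed of light, m/s

def doppler_wavelength_shift(wavelength_m, velocity_ms, cos_theta=1.0):
    """Backscatter geometry: shift = 2 * v * cos(theta) * lambda / c."""
    return 2.0 * velocity_ms * cos_theta * wavelength_m / C

# Blood cell moving away from a 660 nm source at 1 mm/s:
shift = doppler_wavelength_shift(660e-9, 1e-3)
# shift > 0 (red-shifted); a cell moving toward the source gives shift < 0
```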
Light 2534 from a laser 2532 can pass through a beam splitter

[00371] It can be recognized that the backscattered light 2542 from the tissue 2540 may also include backscattered light from boundary layers within the tissue 2540 and/or wavelength-specific light absorption by material within the tissue

[00372] Figure 28 shows some of these additional optical effects. It is well known that light traveling through a first optical medium having a first refractive index, n1, can be reflected at an interface with a second optical medium having a second refractive index, n2. The light transmitted into the second optical medium will have a transmission angle relative to the interface that differs from the angle of the incident light, based on the difference between the refractive indices n1 and n2 (Snell's law). Figure 28 illustrates the effect of Snell's law on light striking the surface of a multi-component tissue

[00373] Incident light 2170a can be used to probe the blood vessel 2160 and can be directed onto the top surface 2154 of the outer tissue layer 2152. A portion 2172 of the incident laser light 2170a can be reflected at the top surface 2154. Another portion 2170b of the incident laser light 2170a can penetrate the outer tissue layer 2152. The portion 2172 reflected at the top surface 2154 of the outer tissue layer 2152 has the same path length as the incident light 2170a and therefore has the same wavelength and phase as the incident light 2170a. However, the portion 2170b of light transmitted into the outer tissue layer 2152 will have a transmission angle that differs from the angle of incidence of the light falling on the tissue surface, due to the fact that the outer tissue layer 2152 has a refractive index n1 that differs from the refractive index of air.
[00374] If the portion of light transmitted through the outer tissue layer 2152 falls on a second tissue surface 2158, for example, of the blood vessel wall 2156, a certain portion 2174a,b of the light will be reflected back toward the source of the incident light 2170a. The light 2174a thus reflected at the interface between the outer tissue layer 2152 and the blood vessel wall 2156 will have the same wavelength as the incident light 2170a, but will undergo a phase shift due to the change in the length of the light path. Projecting the light 2174a,b reflected from the interface between the outer tissue layer 2152 and the blood vessel wall 2156 together with the incident light onto the sensor will produce an interference pattern based on the phase difference between the two light sources.

[00375] In addition, a portion 2170c of the incident light can be transmitted through the blood vessel wall 2156 and penetrate the lumen of the blood vessel 2160. That portion of incident light 2170c can interact with the blood cells moving within the lumen of the blood vessel 2160 and can be reflected back 2176a to 2176c toward the source of the incident light with a wavelength that has been Doppler-shifted according to the speed of the blood cells, as described above. The Doppler-shifted light 2176a to 2176c reflected from the moving blood cells can be projected together with the incident light onto the sensor, resulting in an interference pattern having a fringe pattern based on the difference in wavelength between the two light sources.

[00376] In Figure 28, a light path 2178 is shown for light falling on the erythrocytes in the lumen of the blood vessel 2160 as if there were no change in refractive index between the emitted light and the light reflected by the moving blood cells. In this example, only a Doppler shift in the wavelength of the reflected light would be detected.
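The phase-shift contribution described above can be sketched as the extra round-trip optical path through the intervening layer: a reflection from a deeper interface accumulates phase 2π·(2·n·d)/λ relative to the surface reflection. The layer thickness and refractive index used below are illustrative values, not figures from this description.

```python
# Sketch of the phase shift between the surface reflection (2172) and the
# deeper reflection (2174a): the deep reflection travels an extra round
# trip (2 * n * d) through the intervening tissue layer.

import math

def round_trip_phase(wavelength_m, thickness_m, n):
    """Extra phase (radians) of the deep reflection relative to the
    surface reflection: 2 * pi * (2 * n * d) / lambda."""
    return 2.0 * math.pi * 2.0 * n * thickness_m / wavelength_m

# Green light through 1 mm of tissue with an assumed refractive index 1.4:
phase = round_trip_phase(wavelength_m=532e-9, thickness_m=1.0e-3, n=1.4)
```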
However, the light reflected by the blood cells (2176a to 2176c) can incorporate phase changes due to the variation in the refractive indices of the tissue, in addition to changes in wavelength due to the Doppler effect.

[00377] Thus, it can be understood that, if the light sensor receives the incident light, the light reflected from one or more tissue interfaces (2172 and 2174a,b) and the Doppler-shifted light from the blood cells (2176a to 2176c), the interference pattern thus produced at the light sensor can include effects due to the Doppler shift (change in wavelength) as well as effects due to the changes in refractive index within the tissue (phase variation). As a result, a Doppler analysis of the light reflected by the tissue sample can produce erroneous results if the effects due to the changes in refractive index within the sample are not compensated.

[00378] Figure 29 illustrates an example of the effects on a Doppler analysis of light falling on a tissue sample 2250 to determine the depth and location of an underlying blood vessel. If there is no intervening tissue between the blood vessel and the tissue surface, the interference pattern detected at the sensor may be due mainly to the shift in the wavelength reflected from the moving blood cells. As a result, a spectrum 2252 derived from the interference pattern can generally reflect only the Doppler shift of the blood cells. However, if there is intervening tissue between the blood vessel and the tissue surface, the interference pattern detected at the sensor may be due to a combination of the shift in the wavelength reflected from the moving blood cells and the phase shift due to the refractive index of the intervening tissue. A spectrum 2254 derived from such an interference pattern can result in a calculated Doppler shift that is confounded by the additional phase shift in the reflected light.
In some aspects, if information regarding the characteristics (thickness and refractive index) of the intervening tissue is known, the resulting spectrum 2256 can be corrected to provide a more accurate calculation of the wavelength shift.

[00379] It is recognized that the depth of light penetration into tissue depends on the wavelength of the light used. In this way, the wavelength of the laser source light can be chosen to detect particle movement (such as that of blood cells) within a specific range of tissue depths. Figures 30A to 30C schematically represent a means of detecting moving particles, such as blood cells, at a variety of tissue depths based on the wavelength of the laser light. As illustrated in Figure 30A, a laser source 2340 can direct an incident beam of laser light 2342 at a surface 2344 of a surgical site. A blood vessel 2346 (such as a vein or artery) can be disposed within the tissue 2348 at a certain depth δ from the tissue surface. The depth of penetration 2350 of a laser into a tissue 2348 may depend at least in part on the laser wavelength. Thus, laser light having a wavelength in the red range of about 635 nm to about 660 nm can penetrate the tissue to a depth of about 1 mm 2351a. Laser light having a wavelength in the green range of about 520 nm to about 532 nm can penetrate the tissue to a depth of about 2 to 3 mm 2351b. Laser light having a wavelength in the blue range of about 405 nm to about 445 nm can penetrate the tissue to a depth of about 4 mm or more 2351c. In the example shown in Figures 30A to 30C, a blood vessel 2346 can be located at a depth δ of about 2 to 3 mm below the tissue surface. The red laser light will not penetrate to this depth and, therefore, will not detect blood cells flowing within this vessel. However, both the green and the blue laser light can penetrate to that depth.
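The depth-selection logic above can be sketched as a lookup over the approximate penetration depths quoted for red, green and blue laser light. The depth figures are the approximate values from this description; treating "about 4 mm or more" as exactly 4 mm is a simplification for illustration.

```python
# Sketch: given the approximate penetration depths for red, green, and
# blue laser light, report which sources can reach (and thus
# Doppler-interrogate) a vessel at depth d. Values approximate.

PENETRATION_MM = {'red (635-660 nm)': 1.0,
                  'green (520-532 nm)': 3.0,
                  'blue (405-445 nm)': 4.0}   # "4 mm or more" simplified to 4

def sources_reaching(depth_mm):
    """Return the laser sources whose penetration depth covers depth_mm."""
    return [name for name, reach in PENETRATION_MM.items() if reach >= depth_mm]

print(sources_reaching(2.5))  # a vessel at ~2.5 mm: green and blue reach it
```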
Therefore, green and blue laser light scattered from the blood cells within the blood vessel 2346 can demonstrate a Doppler shift in wavelength.

[00380] Figure 30B illustrates how a Doppler shift 2355 in the wavelength of the reflected laser light can appear. The emitted light (or source light) 2342 that strikes a tissue surface 2344 may have a central wavelength 2352. For example, light from a green laser may have a central wavelength 2352 within a range of about 520 nm to about 532 nm. The reflected green light can have a central wavelength 2354 shifted to a longer wavelength (red-shifted) if the light has been reflected from a particle, such as an erythrocyte, moving away from the detector. The difference between the central wavelength 2352 of the emitted laser light and the central wavelength 2354 of the reflected laser light comprises the Doppler shift 2355.

[00381] As described above with respect to Figures 28 and 29, laser light reflected from structures within a tissue 2348 may also show a phase shift in the reflected light due to changes in the refractive index resulting from changes in the structure or composition of the tissue. The emitted light (or source light) 2342 that strikes a tissue surface 2344 may have a first phase characteristic 2356. The reflected laser light may have a second phase characteristic 2358. It can be recognized that blue laser light, which can penetrate tissue to a depth of about 4 mm or more 2351c, can encounter a greater variety of tissue structures than red laser light (about 1 mm 2351a) or green laser light (about 2 to 3 mm 2351b). Consequently, as illustrated in Figure 30C, the phase shift 2358 of the reflected blue laser light can be significant, at least due to its depth of penetration.

[00382] Figure 30D illustrates aspects of the sequential illumination of tissue by red 2360a, green 2360b and blue 2360c laser light.
In some aspects, a tissue can be scanned by the red 2360a, green 2360b and blue 2360c laser illumination sequentially. In some alternative examples, one or more combinations of red 2360a, green 2360b and blue 2360c laser light, as shown in Figures 23D to 23F and described above, can be used to illuminate the tissue according to a defined illumination sequence. Figure 30D illustrates the effect of such illumination on a CMOS imaging sensor 2362a to 2362d over time. Thus, at a first time t1, the CMOS sensor 2362a can be illuminated by the red laser 2360a. At a second time t2, the CMOS sensor 2362b can be illuminated by the green laser 2360b. At a third time t3, the CMOS sensor 2362c can be illuminated by the blue laser 2360c. The illumination cycle can then be repeated starting at a fourth time t4, at which the CMOS sensor 2362d can again be illuminated by the red laser 2360a. It can be recognized that sequential illumination of the tissue by laser light at different wavelengths can allow Doppler analysis at different tissue depths over time. Although red 2360a, green 2360b and blue 2360c laser sources can be used to illuminate the surgical site, it can be recognized that other wavelengths outside the visible range (such as in the infrared or ultraviolet regions) can also be used to illuminate the surgical site for Doppler analysis.

[00383] Figure 31 illustrates an example of the use of Doppler imaging to detect the presence of blood vessels that are not otherwise visible at a surgical site 2600. In Figure 31, a surgeon may wish to remove a tumor 2602 found in the posterior right lobe 2604 of a lung. Because the lungs are highly vascular, care must be taken to identify only the blood vessels associated with the tumor and to seal only those vessels without compromising blood flow to the unaffected portions of the lung. In Figure 31, the surgeon has identified the margin 2606 of the tumor 2602.
The surgeon can then cut an initial dissected area 2608 at the edge of the margin 2606, and the exposed blood vessels 2610 can be identified for cutting and sealing. A Doppler imaging detector 2620 can be used to locate and identify blood vessels 2612 in the dissected area that are not otherwise observable. An imaging system can receive the data from the Doppler imaging detector 2620 for analysis and visualization of the data obtained from the surgical site 2600. In some aspects, the imaging system may include a display depicting the surgical site 2600 that includes a visible image of the surgical site 2600 along with an overlay image of the blood vessels 2612 hidden in the image of the surgical site 2600.

[00384] In the scenario presented above with respect to Figure 31, a surgeon wants to cut the blood vessels that supply oxygen and nutrients to a tumor while sparing the blood vessels associated with non-cancerous tissue. In addition, the blood vessels can be disposed at different depths at or around the surgical site 2600. The surgeon must therefore identify the position (depth) of the blood vessels to determine whether they are suitable for resection. Figure 32 illustrates a method for identifying deep blood vessels based on a Doppler shift of the light scattered from the blood cells flowing through them. As described above, red laser light has a penetration depth of about 1 mm and green laser light has a penetration depth of about 2 to 3 mm. However, a blood vessel lying at a depth of 4 mm or more below the surface will be beyond the penetration depths at these wavelengths. Blue laser light, however, can detect such blood vessels based on their blood flow.

[00385] Figure 32 represents the Doppler shift of laser light reflected from a blood vessel at a specific depth below a surgical site. The site can be illuminated by red laser light, green laser light and blue laser light. The central wavelength 2630 of the illumination light can be normalized to a relative center 2631.
If the blood vessel resides at a depth of 4 mm or more below the surface of the surgical site, neither the red laser light nor the green laser light will be reflected by the blood vessel. Consequently, the central wavelength 2632 of the reflected red light and the central wavelength 2634 of the reflected green light will not differ much from the central wavelength 2630 of the red or green illumination light, respectively. However, if the site is illuminated by blue laser light, the central wavelength 2638 of the reflected blue light 2636 will differ from the central wavelength 2630 of the blue illumination light. In some cases, the amplitude of the reflected blue light 2636 can also be significantly reduced relative to the amplitude of the blue illumination light. A surgeon can thereby determine the presence of a deep blood vessel along with its approximate depth, and can thus avoid the deep blood vessel during dissection of the surface tissue.

[00386] Figures 33 and 34 schematically illustrate the use of laser sources with different central wavelengths (colors) to determine the approximate depth of a blood vessel beneath the surface of a surgical site. Figure 33 represents a first surgical site 2650 that has a surface 2654 and a blood vessel 2656 disposed below the surface 2654. In one method, the blood vessel 2656 can be identified based on a Doppler shift of the light scattered from the flow 2658 of blood cells within the blood vessel

[00387] In contrast to the blood vessel 2656 shown in Figure 33, the blood vessel 2656' shown in Figure 34 is located closer to the tissue surface at the surgical site. The blood vessel 2656' can also be distinguished from the blood vessel 2656 in that the blood vessel 2656' is illustrated as having a much thicker wall 2657. Thus, the blood vessel 2656' may be an example of an artery, while the blood vessel 2656 may be an example of a vein, because arterial walls are known to be thicker than venous walls.
In some instances, the arterial walls may be about 1.3 mm thick. As described above, the red laser illumination 2670' can penetrate the tissue to a depth 2672' of about 1 mm. In this way, even if a blood vessel [00388] As shown above, the depth of blood vessels below the surgical site can be probed with wavelength-dependent Doppler imaging. The amount of blood flow through such a blood vessel can also be determined by speckle contrast (interference) analysis. The Doppler effect can indicate a particle moving relative to a stationary light source. As described above, the Doppler wavelength shift can be an indication of the speed of the particle's movement. Individual particles such as blood cells may not be observable separately. However, the speed of each blood cell will produce a proportional Doppler shift. An interference pattern can be generated by combining the backscattered light from multiple blood cells, due to the differences in the Doppler shift of the light backscattered from each of the blood cells. The interference pattern can be an indication of the number density of blood cells within a display frame. The interference pattern can be called speckle contrast. Speckle contrast analysis can be performed using a 300 x 300 full-frame CMOS imaging array, and the speckle contrast can be directly related to the number of moving particles (for example, blood cells) interacting with the laser light over a given exposure period. [00389] A CMOS image sensor can be coupled to a digital signal processor (DSP). Each pixel of the sensor can be multiplexed and digitized. The Doppler shift of the light can be analyzed by comparing the laser source light with the Doppler-shifted light. A larger Doppler shift and speckle may be related to a greater number of blood cells and a greater blood cell velocity within the blood vessel.
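The speckle contrast computation described above can be sketched as follows. This is only a minimal illustration, not the patent's implementation; the window size and the frames used are arbitrary assumptions. The local contrast is taken as K = σ/μ over small pixel windows of the CMOS frame:

```python
import math

def speckle_contrast(frame, window=5):
    """Local speckle contrast K = sigma/mean over non-overlapping square windows.

    `frame` is a 2-D list of pixel intensities (e.g., a 300 x 300 CMOS frame).
    K near 1 suggests a static, fully developed speckle pattern; lower K
    suggests motion (e.g., flowing blood cells) blurring the speckle during
    the exposure period.
    """
    rows, cols = len(frame), len(frame[0])
    kmap = []
    for r in range(0, rows - window + 1, window):
        krow = []
        for c in range(0, cols - window + 1, window):
            pixels = [frame[r + i][c + j] for i in range(window) for j in range(window)]
            mean = sum(pixels) / len(pixels)
            var = sum((p - mean) ** 2 for p in pixels) / len(pixels)
            krow.append(math.sqrt(var) / mean if mean > 0 else 0.0)
        kmap.append(krow)
    return kmap
```

Windows that image moving blood cells blur during the exposure and yield a lower K, which is why the contrast map can indicate the amount and location of flow.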
[00390] Figure 35 represents an aspect of a composite visual display 2800 that can be presented to a surgeon during a surgical procedure. The composite visual display 2800 can be constructed by superimposing a white-light image 2830 of the surgical site with an image 2850 derived from Doppler analysis. [00391] In some aspects, the white-light image 2830 can depict the surgical site 2832, one or more surgical incisions 2834, and the tissue 2836 readily visible within the surgical incision 2834. The white-light image 2830 can be generated through illumination 2840 of the surgical site 2832 with a white light source 2838 and reception of the reflected white light 2842 by an optical detector. Although a white light source 2838 can be used to illuminate the surgical site surface, in one aspect the surgical site surface can be visualized using suitable combinations of red 2854, green 2856, and blue 2858 laser light, as described above with respect to Figures 23C to 23F. [00392] In some aspects, the Doppler analysis image 2850 may include information about blood vessel depth along with information about the blood flow 2852 (from the speckle analysis). As described above, blood vessel depth and blood flow velocity can be obtained by illuminating the surgical site with laser light of multiple wavelengths and determining the depth of the blood vessel and the blood flow based on the known penetration depth of light of a specific wavelength. In general, the surgical site 2832 can be illuminated by the light emitted by one or more lasers, such as a red laser 2854, a green laser 2856, and a blue laser 2858. A CMOS detector 2872 can receive the light reflected back (2862, 2866, 2870) from the surgical site 2832 and its surrounding tissue. The Doppler analysis image 2850 can be constructed 2874 based on an analysis of the multi-pixel data from the CMOS detector 2872.
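The relation between a measured Doppler frequency shift and blood cell velocity that underlies this analysis can be sketched with the standard laser Doppler formula Δf = 2·v·cos(θ)/λ. This is a textbook relation, not text from the patent, and the function name and unit choices are illustrative:

```python
import math

def flow_velocity_mm_per_s(doppler_shift_hz, wavelength_nm, beam_angle_deg=0.0):
    """Invert the laser Doppler relation delta_f = 2 * v * cos(theta) / lambda.

    doppler_shift_hz: measured frequency shift of the backscattered light
    wavelength_nm:    central wavelength of the illuminating laser
    beam_angle_deg:   angle between the beam and the flow direction
    Returns the scatterer (blood cell) speed in mm/s.
    """
    wavelength_mm = wavelength_nm * 1e-6
    return doppler_shift_hz * wavelength_mm / (2.0 * math.cos(math.radians(beam_angle_deg)))
```

For example, a 1 MHz shift measured at 635 nm (red laser) and normal incidence corresponds to about 317.5 mm/s; the same shift at an oblique beam angle implies a proportionally higher speed.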
[00393] In one aspect, a red laser 2854 can emit red laser light 2860 onto the surgical site 2832, and the reflected light 2862 can reveal surface or shallow subsurface structures. In one aspect, a green laser 2856 can emit green laser illumination 2864 onto the surgical site 2832, and the reflected light 2866 can reveal characteristics of a deeper subsurface. In one aspect, a blue laser 2858 can emit blue laser illumination 2868 onto the surgical site 2832, and the reflected light 2870 can reveal, for example, blood flow within the deepest vascular structures. In addition, speckle contrast analysis can provide the surgeon with information regarding the amount and speed of blood flow through the deepest vascular structures. [00394] Although not shown in Figure 35, it can be understood that the imaging system can also illuminate the surgical site with light outside the visible range. Such light can include infrared light and ultraviolet light. In some aspects, the infrared or ultraviolet light sources may include broadband wavelength sources (such as a tungsten source, a tungsten-halogen source, or a deuterium source). In some other aspects, the infrared or ultraviolet light sources may include narrow-band wavelength sources (IR laser diodes, UV gas lasers, or dye lasers). [00395] Figure 36 is a flow chart 2900 of a method for determining a depth of a feature in a piece of tissue. An image capture system can illuminate 2910 a tissue with a first beam of light having a first central frequency and receive 2912 a first reflected light from the tissue illuminated by the first beam of light. The image capture system can then calculate 2914 a first Doppler shift based on the first beam of light and the first reflected light. The image capture system can then illuminate 2916 the tissue with a second beam of light having a second central frequency and receive 2918 a second reflected light from the tissue illuminated by the second beam of light.
The image capture system can then calculate 2920 a second Doppler shift based on the second beam of light and the second reflected light. The image capture system can then calculate 2922 a depth of a tissue feature based in part on the first central wavelength, the first Doppler shift, the second central wavelength, and the second Doppler shift. In some aspects, the tissue features can include the presence of moving particles, such as blood cells moving within a blood vessel, and a direction and speed of the flow of the moving particles. It can be understood that the method can be extended to include illuminating the tissue with any one or more additional beams of light. In addition, the system can calculate an image that comprises a combination of an image of the tissue surface and an image of the structures arranged within the tissue. [00396] In some aspects, multiple visual displays can be used. For example, a 3D display can provide a composite image that combines white light (or a suitable combination of red, green, and blue laser light) and laser Doppler imaging. Additional displays may provide only the white-light view, or a composite of the white-light view and an NIRS view showing only the tissue's blood oxygen response. The NIRS view, however, may not need to be updated on every cycle, which allows time for a tissue response.
Characterization of subsurface tissue using multispectral OCT
[00397] During a surgical procedure, the surgeon may employ "smart" surgical devices for manipulating tissue. Such devices can be considered "smart" in the sense that they include features to direct, control, and/or vary the actions of the devices based on parameters relevant to their use. The parameters can include the type and/or composition of the tissue being manipulated. If the type and/or composition of the tissue being manipulated is unknown, the actions of the smart devices may be inappropriate for the tissue being manipulated.
As a result, tissue may be damaged or the tissue manipulation may be ineffective due to improper settings of the smart device. [00398] The surgeon may attempt to vary the parameters of the smart device manually, in a trial-and-error manner, resulting in an inefficient and time-consuming surgical procedure. [00399] Therefore, it is desirable to have a surgical visualization system that can probe the tissue structures underlying a surgical site to determine their structural and compositional characteristics and provide such data to the smart surgical instruments used in a surgical procedure. [00400] Some aspects of the present description also provide a control circuit configured to control the illumination of a surgical site using one or more light sources, such as laser light sources, and to receive imaging data from one or more image sensors. In some aspects, the present description provides a non-transitory computer-readable medium that stores computer-readable instructions that, when executed, cause a device to characterize subsurface structures at a surgical site and determine the depth of the structures below the tissue surface. [00401] In some aspects, a surgical imaging system can comprise a plurality of light sources, each light source configured to emit light having a specified central wavelength; a light sensor configured to receive a portion of the light reflected from a tissue sample when illuminated by one or more of the plurality of illumination sources; and a computing system.
The computing system can be configured to receive data from the light sensor when the tissue sample is illuminated by each of the plurality of light sources, calculate structural data related to a characteristic of a structure within the tissue sample based on the data received by the light sensor when the tissue sample is illuminated by each of the illumination sources, and transmit the structural data related to the characteristic of the structure to be received by a smart surgical device. In some aspects, the characteristic of the structure is a surface characteristic or a composition of the structure. [00402] In one aspect, a surgical system can include multiple laser light sources and can receive laser light reflected from a tissue. The light reflected from the tissue can be used by the system to calculate the surface characteristics of components arranged within the tissue. The characteristics of the components arranged within the tissue can include a composition of the components and/or a metric related to surface irregularities of the components. [00403] In one aspect, the surgical system can transmit the data related to the composition of the components and/or the metrics related to surface irregularities of the components to a second instrument to be used on the tissue, in order to modify the control parameters of the second instrument. [00404] In some aspects, the second device may be an advanced energy device, and the modifications to the control parameters may include a clamp pressure, an operating power level, an operating frequency, and a transducer signal amplitude. [00405] As described above, blood vessels can be detected below the surface of a surgical site based on the Doppler shift of the light reflected by the blood cells moving within the blood vessels. [00406] Laser Doppler flowmetry can be used to visualize and characterize a flow of particles moving relative to an effectively stationary background.
In this way, laser light scattered by moving particles, such as blood cells, may have a wavelength different from that of the original laser illumination source. In contrast, the laser light scattered by the effectively stationary background (for example, vascular tissue) may have a wavelength equal to that of the original laser light source. The change in the wavelength of the light scattered from the blood cells can reflect both the direction of the blood cell flow relative to the laser source and the speed of the blood cells. As previously described, Figures 26A to 26C illustrate the change in wavelength of light scattered from blood cells that may be moving away from (Figure 26A) or toward (Figure 26C) the laser light source. [00407] In each of Figures 26A to 26C, the original illumination light 2502 is represented with a relative central wavelength of 0. It can be seen from Figure 26A that the light scattered from blood cells moving away from the laser source 2504 has a wavelength shifted by a certain amount 2506 to a wavelength longer than that of the laser source (and is therefore red-shifted). It can also be seen from Figure 26C that the light scattered from blood cells moving toward the laser source 2508 has a wavelength shifted by a certain amount 2510 to a shorter wavelength relative to the laser source (and is therefore blue-shifted). The amount of the wavelength shift (for example, 2506 or 2510) may depend on the speed at which the blood cells move. In some aspects, the amount of the red shift (2506) of some blood cells can be approximately the same as the amount of the blue shift (2510) of some other blood cells. Alternatively, the amount of the red shift (2506) of some blood cells may differ from the amount of the blue shift (2510) of some other blood cells.
Thus, the speed of the blood cells flowing away from the laser source, as shown in Figure 26A, may be less than the speed of the blood cells flowing toward the laser source, as shown in Figure 26C, based on the relative magnitude of the wavelength shifts (2506 and 2510). In contrast, and as shown in Figure 26B, light scattered from tissue that does not move relative to the laser light source (for example, blood vessels 2512 or non-vascular tissue 2514) may not demonstrate any change in wavelength. [00408] As previously described, Figure 27 represents an aspect of instrumentation 2530 that can be used to detect a Doppler shift in laser light scattered from portions of a tissue 2540. [00409] It can be recognized that the backscattered light 2542 from the tissue 2540 can also include light backscattered from boundary layers within the tissue 2540 and/or absorption of light of specific wavelengths by material within the tissue 2540. As a result, the interference pattern observed at the detector 2550 can incorporate interference fringe features from these additional optical effects, which can therefore confound the calculation of the Doppler shift if not properly analyzed. [00410] It can be recognized that the light reflected from the tissue may also include light backscattered from boundary layers within the tissue and/or absorption of light of specific wavelengths by material within the tissue. As a result, the interference pattern observed at the detector can incorporate fringe features that can confound the calculation of the Doppler shift if not properly analyzed. [00411] As previously described, Figure 28 shows some of these additional optical effects. It is well known that light traveling through a first optical medium having a first refractive index, n1, can be reflected at an interface with a second optical medium having a second refractive index, n2.
The light transmitted into the second optical medium will have a transmission angle relative to the interface that differs from the angle of the incident light, based on the difference between the refractive indices n1 and n2 (Snell's law). Figure 28 illustrates the effect of Snell's law on light striking the surface of a multi-component tissue 2150, as may be seen in a surgical field. The multi-component tissue 2150 may consist of an outer tissue layer 2152 having a refractive index n1 and a buried tissue, such as a blood vessel having a vessel wall 2156. The vessel wall 2156 can be characterized by a refractive index n2. Blood can flow in the lumen of the blood vessel 2160. In some aspects, it may be important during a surgical procedure to determine the position of the blood vessel 2160 below the surface 2154 of the outer tissue layer 2152. [00412] Incident laser light 2170a can be used to probe the blood vessel 2160 and can be directed at the top surface 2154 of the outer tissue layer 2152. A portion 2172 of the incident laser light 2170a can be reflected at the top surface 2154. Another portion 2170b of the incident laser light 2170a can penetrate the outer tissue layer 2152. The portion 2172 reflected at the top surface 2154 of the outer tissue layer 2152 has the same path length as the incident light 2170a and therefore has the same wavelength and phase as the incident light 2170a. However, the portion 2170b of the light transmitted into the outer tissue layer 2152 will have a transmission angle that differs from the angle of incidence of the light striking the tissue surface, due to the fact that the outer tissue layer 2152 has a refractive index n1 that differs from the refractive index of air.
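The bending of the transmitted ray described above follows Snell's law, n1·sin(θ1) = n2·sin(θ2). A minimal sketch (the refractive-index values in the usage note are illustrative assumptions, not tissue measurements from the patent):

```python
import math

def refraction_angle(theta_incident_deg, n1, n2):
    """Angle of the transmitted ray via Snell's law: n1*sin(t1) = n2*sin(t2).

    Angles are measured from the surface normal, in degrees. Returns None
    when n1*sin(t1)/n2 exceeds 1 (total internal reflection).
    """
    s = n1 * math.sin(math.radians(theta_incident_deg)) / n2
    if abs(s) > 1.0:
        return None  # no transmitted ray
    return math.degrees(math.asin(s))
```

For instance, a ray arriving from air (n1 = 1.0) at 30° from the normal into a layer with an assumed index n2 = 1.33 refracts to about 22.1°, which is why the transmitted portion 2170b changes direction at the surface 2154 and the internal path length differs from a straight-line path.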
[00413] If the portion of light transmitted through the outer tissue layer 2152 strikes a second tissue surface 2158, for example of the blood vessel wall 2156, some portion 2174a,b of the light will be reflected back toward the source of the incident light 2170a. The light 2174a thus reflected at the interface between the outer tissue layer 2152 and the blood vessel wall 2156 will have the same wavelength as the incident light 2170a but will be phase-shifted due to the change in the length of the light path. Projecting the light 2174a,b reflected from the interface between the outer tissue layer 2152 and the blood vessel wall 2156 together with the incident light onto a sensor will produce an interference pattern based on the phase difference between the two light sources. [00414] In addition, a portion 2170c of the incident light can be transmitted through the blood vessel wall 2156 and penetrate the lumen of the blood vessel 2160. That portion 2170c of the incident light can interact with the blood cells moving in the lumen of the blood vessel 2160 and can be reflected back 2176a to 2176c toward the source of the incident light, with a wavelength that has been Doppler-shifted according to the speed of the blood cells, as shown above. The Doppler-shifted light 2176a to 2176c reflected from the moving blood cells can be projected together with the incident light onto the sensor, resulting in an interference pattern having a fringe pattern based on the difference in wavelength between the two light sources. [00415] In Figure 28, a light path 2178 is shown for light striking the erythrocytes in the lumen of the blood vessel 2160 as if there were no change in refractive index between the emitted light and the light reflected by the moving blood cells. In this example, only a Doppler shift in the wavelength of the reflected light would be detected.
However, the light reflected by the blood cells (2176a to 2176c) can incorporate phase changes due to the variation in the refractive indices of the tissue, in addition to the changes in wavelength due to the Doppler effect. [00416] Thus, it can be understood that if the light sensor receives the incident light, the light reflected from one or more tissue interfaces (2172 and 2174a,b), and the Doppler-shifted light from the blood cells (2176a to 2176c), the interference pattern thus produced at the light sensor can include effects due to the Doppler shift (change in wavelength) as well as effects due to the change in refractive index within the tissue (phase variation). As a result, a Doppler analysis of the light reflected by the tissue sample can produce erroneous results if the effects due to changes in the refractive index within the sample are not compensated. [00417] As previously described, Figure 29 illustrates an example of the effects on a Doppler analysis of light striking a tissue sample 2250 to determine the depth and location of an underlying blood vessel. If there is no intervening tissue between the blood vessel and the tissue surface, the interference pattern detected at the sensor may be due mainly to the change in the wavelength of the light reflected from the moving blood cells. As a result, a spectrum 2252 derived from the interference pattern can generally reflect only the Doppler shift from the blood cells. However, if there is intervening tissue between the blood vessel and the tissue surface, the interference pattern detected at the sensor may be due to a combination of the change in the wavelength of the light reflected from the moving blood cells and the phase shift due to the refractive index of the intervening tissue. A spectrum 2254 derived from such an interference pattern can yield a calculated Doppler shift that is confounded by the additional phase shift in the reflected light.
In some aspects, if information regarding the characteristics (thickness and refractive index) of the intervening tissue is known, the resulting spectrum 2256 can be corrected to provide a more accurate calculation of the change in wavelength. [00418] It can be recognized that the phase shift in the light reflected from a tissue can provide additional information regarding the underlying tissue structures, independent of Doppler effects. [00419] Figure 37 illustrates that the location and characteristics of non-vascular structures can be determined based on the phase difference between the incident light 2372 and the light reflected from deep tissue structures (2374, 2376, 2378). As noted above, the penetration depth of light striking a tissue depends on the wavelength of the incident illumination. Red laser light (having a wavelength in the range of about 635 nm to about 660 nm) can penetrate tissue to a depth of about 1 mm. Green laser light (having a wavelength in the range of about 520 nm to about 532 nm) can penetrate tissue to a depth of about 2 to 3 mm. Blue laser light (having a wavelength in the range of about 405 nm to about 445 nm) can penetrate tissue to a depth of about 4 mm or more. In one aspect, an interface 2381a between two tissues that differ in refractive index and that lies less than or about 1 mm below a tissue surface 2380 may reflect 2374 red, green, or blue laser light. The phase of the reflected light 2374 can be compared to the incident light 2372, and thus the difference in the tissue refractive index at the interface 2381a can be determined. In another aspect, an interface 2381b between two tissues that differ in refractive index and that lies between 2 and 3 mm below a tissue surface 2380 may reflect 2376 green or blue laser light, but not red light.
The phase of the reflected light 2376 can be compared to the incident light 2372, and thus the difference in the tissue refractive index at the interface 2381b can be determined. In yet another aspect, an interface 2381c between two tissues that differ in refractive index and that lies between 3 and 4 mm below a tissue surface 2380 may reflect 2378 only blue laser light, but not red or green light. The phase of the reflected light 2378 can be compared to the incident light 2372, and thus the difference in the tissue refractive index at the interface 2381c can be determined. [00420] The measurement of the phase interference of a tissue illuminated with light of different wavelengths can therefore provide information on the relative refractive indices of the reflecting tissue as well as on the depth of the tissue. The refractive indices of the tissue can be evaluated using multiple laser sources and their intensities, and thus the relative refractive indices of the tissue can be calculated. It is recognized that different tissues can have different refractive indices. For example, the refractive index may be related to the relative composition of collagen and elastin in a tissue, or to the amount of hydration in the tissue. Therefore, a technique for measuring the relative refractive index of the tissue can lead to the identification of a tissue composition. [00421] In some aspects, smart surgical instruments include algorithms to determine the parameters associated with the function of the instruments. A non-limiting example of such a parameter may be the pressure of an anvil of a smart stapling device against a tissue. The amount of pressure of an anvil against a tissue may depend on the type and composition of the tissue. For example, less pressure may be needed to staple a highly compressible tissue, while more pressure may be required to staple a less compressible tissue.
Another non-limiting example of a parameter associated with a smart surgical device may include a firing rate for an I-beam knife used to cut the tissue. For example, a rigid tissue may require more force and a slower cutting rate than a less rigid tissue. Another non-limiting example of such parameters may be the amount of current supplied to an electrode in a smart RF cauterization or sealing device. The composition of the tissue, such as the percentage of tissue hydration, can determine the amount of current needed to heat-seal the tissue. Yet another non-limiting example of such parameters may be the amount of energy supplied to an ultrasonic transducer of a smart ultrasonic cutting device, or the activation frequency of the cutting device. A rigid tissue may require more energy to be cut, and contacting the ultrasonic cutting tool with a rigid tissue can shift the cutter's resonant frequency. [00422] It can be recognized that a tissue visualization system that can identify the type and depth of the tissue can provide this data to one or more smart surgical devices. The identification and location data can then be used by the smart surgical devices to adjust one or more of their operational parameters, thus enabling them to optimize their tissue manipulation. It can be understood that an optical method for characterizing a type of tissue can allow the automation of the operational parameters of smart surgical devices. Such automation of the operation of smart surgical instruments may be preferable to relying on human estimation to determine the operational parameters of the instruments. [00423] In one aspect, optical coherence tomography (OCT) is a technique that can visualize subsurface tissue structures based on the phase difference between an illumination light source and the light reflected by structures located within the tissue.
Figure 38 schematically represents an example of instrumentation 2470 for optical coherence tomography. In Figure 38, a laser source 2472 can emit light 2482 at any optical wavelength of interest (red, green, blue, infrared, or ultraviolet). The light 2482 can be directed to a dichroic beam splitter 2486. The dichroic beam splitter 2486 directs a portion of the light 2488 to a tissue sample 2480. The dichroic beam splitter 2486 can also direct a portion of the light 2492 to a stationary reference mirror [00424] As described above, depth information about subsurface tissue structures can be determined from a combination of the laser light wavelength and the phase of the light reflected from a deep tissue structure. Additionally, the inhomogeneity of the surface of a local tissue can be determined by comparing the phase and amplitude differences of the light reflected from different portions of the same subsurface tissues. Measurements of a difference in the surface properties of the tissue at a defined location compared to those at a neighboring site may be indicative of adhesions, disorganization of the tissue layers, infection, or a neoplasm in the tissue being probed. [00425] Figure 39 illustrates this effect. The characteristics of the surface of a tissue determine the angle of reflection of the light striking the surface. A smooth surface 2551a reflects light with essentially the same spread 2544 as the light 2542 striking the surface (specular reflection). Consequently, a light detector that has a known, fixed aperture can effectively receive the entire amount of the light 2544 reflected from the smooth surface 2551a. However, greater roughness of a tissue surface can result in a greater spread of the reflected light compared to the incident light (diffuse reflection).
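The interferometric arrangement of Figure 38 produces a detector intensity that peaks when the reference and sample path lengths match. A minimal time-domain OCT model is sketched below; the wavelength, coherence length, and intensity values are illustrative assumptions, not parameters from the patent:

```python
import math

def oct_detector_intensity(delta_path_um, wavelength_um=0.85,
                           coherence_len_um=10.0, i_ref=1.0, i_sample=0.1):
    """Idealized time-domain OCT detector signal.

    Interference fringes cos(2*pi*delta/lambda) are modulated by a Gaussian
    coherence envelope, so the fringe amplitude is largest when the reference
    and sample arm path lengths match (delta_path_um near 0).
    """
    envelope = math.exp(-(delta_path_um / coherence_len_um) ** 2)
    fringe = math.cos(2.0 * math.pi * delta_path_um / wavelength_um)
    return i_ref + i_sample + 2.0 * math.sqrt(i_ref * i_sample) * envelope * fringe
```

Scanning the reference mirror and locating the peak of the fringe envelope localizes a reflecting interface in depth; a shorter coherence length (broader source bandwidth) gives finer depth discrimination.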
[00426] A certain amount of the light 2546 reflected from a tissue surface that has some surface irregularities 2551b will fall outside the fixed aperture of the light detector, due to the increased spread of the reflected light 2546. As a result, the light detector will detect less light (shown in Figure 39 as a decrease in the amplitude of the reflected light signal 2546). It should be understood that the spread of the reflected light will increase as the surface roughness of a tissue increases. Thus, as shown in Figure 39, the amplitude of the light 2548 reflected from a surface 2551c that has significant surface roughness may be smaller than that of the light 2544 reflected from a smooth surface 2551a, or of the light 2546 reflected from a surface that has only a moderate amount of surface roughness 2551b. Therefore, in some aspects, a single laser source can be used to investigate the quality of a tissue surface or subsurface by comparing the optical properties of the light reflected from the tissue with the optical properties of the light reflected from adjacent surfaces. [00427] In other aspects, light from multiple laser sources (for example, lasers that emit light with different central wavelengths) can be used sequentially to probe the characteristics of the tissue surface at a variety of depths below the surface 2550. As described above (with reference to Figure 37), the absorbance profile of laser light in a tissue depends on the central wavelength of the laser light. Laser light that has a shorter central wavelength (bluer) can penetrate the tissue more deeply than laser light that has a longer central wavelength (redder). Therefore, measurements related to diffuse light reflection made at different wavelengths of light can indicate both the amount of surface roughness and the depth of the surface being measured.
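The multi-wavelength roughness probing just described can be reduced to a simple decision rule. The sketch below uses the penetration depths stated in the description (red ~1 mm, green ~2-3 mm, blue ~4 mm); the cutoff value and the returned labels are illustrative assumptions, not the patent's algorithm:

```python
def rough_layer_depth(ratio_red, ratio_green, ratio_blue, cutoff=0.9):
    """Estimate the depth band of a rough (diffusely reflecting) layer.

    Each argument is the ratio of reflected to emitted amplitude at that
    laser wavelength. A ratio near 1 suggests specular reflection (smooth
    surface, or the layer was not reached); a reduced ratio suggests diffuse
    scattering somewhere within that wavelength's penetration depth.
    """
    if ratio_red < cutoff:
        return "within ~1 mm"      # even the shallow-penetrating red light scatters
    if ratio_green < cutoff:
        return "~2-3 mm"           # green reaches the rough layer, red does not
    if ratio_blue < cutoff:
        return "~3-4 mm"           # only the deepest-penetrating blue is affected
    return "no significant roughness detected"
```

This mirrors the example in the text: unattenuated red light plus attenuated green and blue light places the irregular layer at roughly 2 to 3 mm below the surface.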
[00428] Figure 40 illustrates a method of displaying image processing data related to a combination of tissue visualization modalities. The data used in the display can be derived from image phase data related to the tissue layer composition, image intensity (amplitude) data related to the tissue surface characteristics, and image wavelength data related to tissue mobility (such as blood cell transport) and tissue depth. As an example, the light emitted by a laser in the blue optical region 2562 can impinge on blood flowing at a depth of about 4 mm below the surface of the tissue. The reflected light 2564 can be red-shifted due to the Doppler effect of the blood flow. As a result, information regarding the existence of a blood vessel and its depth below the surface can be obtained. [00429] In another example, a tissue layer may be at a depth of about 2 to 3 mm below the surface of the surgical site. This tissue may include surface irregularities indicative of scarring or other pathologies. The emitted red light 2572 may not penetrate to a depth of 2 to 3 mm; consequently, the reflected red light 2580 may have approximately the same amplitude as the emitted red light 2572, because it is unable to probe structures located more than 1 mm below the top surface of the surgical site. However, the green light 2578 reflected by the tissue can reveal the existence of surface irregularities at that depth, in that the amplitude of the reflected green light 2578 may be less than the amplitude of the emitted green light 2570. Similarly, the blue light 2574 reflected from the tissue may reveal surface irregularities at that depth, in that the amplitude of the reflected blue light 2574 may be less than the amplitude of the emitted blue light 2562.
In an example of an image processing step, the image 2582 can be smoothed using a moving-window filter 2584 to reduce pixel-to-pixel noise as well as to suppress small local tissue anomalies 2586 that could obscure more important features 2588.
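The moving-window smoothing step can be sketched as a plain moving-average filter. This is an illustrative implementation; the window size and edge handling are arbitrary assumptions:

```python
def moving_window_smooth(img, window=3):
    """Smooth a 2-D image (list of lists) with a square moving-average window.

    Reduces pixel-to-pixel noise and suppresses small local anomalies while
    preserving features larger than the window. Out-of-range indices are
    clamped to the border so the output matches the input size.
    """
    rows, cols = len(img), len(img[0])
    half = window // 2
    out = [[0.0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            total, n = 0.0, 0
            for i in range(r - half, r + half + 1):
                for j in range(c - half, c + half + 1):
                    ii = min(max(i, 0), rows - 1)  # clamp row index to border
                    jj = min(max(j, 0), cols - 1)  # clamp column index to border
                    total += img[ii][jj]
                    n += 1
            out[r][c] = total / n
    return out
```

A single-pixel anomaly is attenuated by roughly the window area (a 3 x 3 window reduces an isolated spike ninefold), while larger structures survive, which is the behavior attributed to the filter 2584.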
The image capture system can then calculate a second surface characteristic at a second depth based on the second emitted beam of light and the second light reflected from the tissue. The tissue characteristics, which may include a tissue type, a tissue composition, and a tissue surface roughness metric, can be determined from the first central light frequency, the second central light frequency, the first light reflected from the tissue, and the second light reflected from the tissue. The tissue characteristic can be used to calculate 2972 one or more parameters related to the function of an intelligent surgical instrument, such as a clamp pressure, an energy to effect tissue cauterization, or an amplitude and/or current frequency to drive a piezoelectric actuator to cut the tissue. In some additional examples, the parameter can be transmitted 2974 directly or indirectly to the intelligent surgical instrument, which can modify its operational characteristics in response to the tissue being manipulated.

[00432] In a minimally invasive procedure, for example a laparoscopic procedure, a surgeon can visualize the surgical site using imaging instruments that include a light source and a camera. The imaging instruments can allow the surgeon to view the end effector of a surgical device during the procedure. However, the surgeon may need to view the tissue on the side opposite the end effector to avoid inadvertent damage during surgery. Such distant tissue may be out of the camera system's field of view when it is focused on the end effector. The imaging instrument can be moved to change the camera's field of view, but it can be difficult to return the camera system to its original position after it has been moved.

[00433] The surgeon may attempt to move the imaging system within the surgical site to view different portions of the site during the procedure.
Repositioning the imaging system takes time, and the surgeon has no guarantee of obtaining the same field of view of the surgical site when the imaging system is returned to its original location.

[00434] Therefore, it is desirable to have a medical imaging visualization system that can provide multiple fields of view of the surgical site without the need to reposition the visualization system. Medical imaging devices include, without limitation, laparoscopes, endoscopes, thoracoscopes, and the like, as described in the present invention. In some aspects, a single display system can display each of the multiple fields of view of the surgical site at approximately the same time. The display of each of the multiple fields of view can be independently updated by a display control system consisting of one or more hardware modules, one or more software modules, one or more firmware modules, or any combination thereof.

[00435] Some aspects of the present description also provide a control circuit configured to control the illumination of a surgical site using one or more light sources, such as laser light sources, and to receive imaging data from one or more image sensors. In some aspects, the control circuit can be configured to control the operation of one or more light sensor modules to adjust a field of view. In some aspects, the present description provides a non-transitory, computer-readable medium that stores computer-readable instructions that, when executed, cause a device to adjust one or more components of one or more light sensor modules and to process an image from each of the one or more light sensor modules.
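As a non-limiting illustration (not part of the original disclosure), the idea of a control system that holds and independently updates each sensor module's field of view can be sketched as follows; all class and field names are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class SensorView:
    """State of one light sensor module's adjustable field of view."""
    center_deg: float    # direction in which the field-of-view center points
    aperture_deg: float  # angular width of the field of view

class ViewController:
    # Hypothetical control structure: each sensor module's view state is
    # stored and steered independently, so one displayed view can change
    # without repositioning the probe or disturbing the other views.
    def __init__(self):
        self.views = {}

    def add_sensor(self, name, center_deg, aperture_deg):
        self.views[name] = SensorView(center_deg, aperture_deg)

    def steer(self, name, new_center_deg):
        self.views[name].center_deg = new_center_deg

ctrl = ViewController()
ctrl.add_sensor("wide", center_deg=0.0, aperture_deg=100.0)
ctrl.add_sensor("narrow", center_deg=0.0, aperture_deg=30.0)
ctrl.steer("narrow", 15.0)  # only the narrow view moves
```

In a real system the `steer` call would drive the movable bezel or lens described later in the text; here it only updates stored state.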
[00436] An aspect of a minimally invasive image capture system may comprise a plurality of light sources, each light source being configured to emit light having a specified central wavelength; a first light sensing element that has a first field of view and is configured to receive reflected illumination from a first portion of the surgical site when the first portion of the surgical site is illuminated by at least one of the plurality of light sources; a second light sensing element that has a second field of view and is configured to receive reflected illumination from a second portion of the surgical site when the second portion of the surgical site is illuminated by at least one of the plurality of light sources, the second field of view overlapping at least a portion of the first field of view; and a computing system.

[00437] The computing system can be configured to receive data from the first light sensing element, receive data from the second light sensing element, compute imaging data based on the data received from the first light sensing element and the data received from the second light sensing element, and transmit the imaging data for reception by a display system.

[00438] A variety of surgical visualization systems have been disclosed above. Such systems provide visualization of tissue and tissue substructures that may be encountered during one or more surgical procedures. Non-limiting examples of such systems may include: systems for determining the location and depth of subsurface vascular tissue, such as veins and arteries; systems for determining the amount of blood flowing through the subsurface vascular tissue; systems for determining the depth of non-vascular tissue structures; systems for characterizing the composition of such non-vascular tissue structures; and systems for characterizing one or more surface characteristics of such tissue structures.
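As a non-limiting sketch (not from the original disclosure) of the fusion step performed by such a computing system, two pixel rows from sensing elements with overlapping fields of view can be merged by averaging the shared pixels; the function name and the one-dimensional simplification are hypothetical:

```python
def combine_rows(row_a, row_b, overlap):
    # Fuse two 1-D pixel rows whose trailing/leading `overlap` pixels
    # image the same region of the surgical site: the shared pixels
    # are averaged, the rest are kept as-is.
    merged_overlap = [
        (a + b) / 2.0
        for a, b in zip(row_a[-overlap:], row_b[:overlap])
    ]
    return row_a[:-overlap] + merged_overlap + row_b[overlap:]

# Two rows whose last/first two pixels see the same tissue region.
row = combine_rows([10, 20, 30, 40], [38, 42, 50, 60], overlap=2)
# row -> [10, 20, 34.0, 41.0, 50, 60]
```

A real implementation would register the two 2-D images geometrically before blending; the averaging here stands in for that step.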
[00439] It can be recognized that a single surgical visualization system can incorporate components of any one or more of these visualization modalities. Figures 22A to 22D represent some examples of such a surgical visualization system.

[00440] As presented above, in a non-limiting aspect, a surgical visualization system 2108 may include an imaging control unit 2002 and a hand unit 2020. The hand unit 2020 may include a body 2021, a camera scope cable 2015 attached to the body 2021, and an elongated camera probe.

[00441] Alternatively, the illumination of the surgical site may be cycled among the visible light sources, as shown in Figure 30D. In some examples, the light sources may include any one or more of a red laser 2360a, a green laser 2360b, or a blue laser 2360c. In some non-limiting examples, the red laser light source 2360a can generate illumination having a peak wavelength that can range between 635 nm and 660 nm, inclusive. Non-limiting examples of a red laser peak wavelength may include about 635 nm, about 640 nm, about 645 nm, about 650 nm, about 655 nm, about 660 nm, or any value or range of values therebetween. In some non-limiting examples, the green laser light source 2360b can generate illumination having a peak wavelength that can range between 520 nm and 532 nm, inclusive. Non-limiting examples of a green laser peak wavelength may include about 520 nm, about 522 nm, about 524 nm, about 526 nm, about 528 nm, about 530 nm, about 532 nm, or any value or range of values therebetween. In some non-limiting examples, the blue laser light source 2360c can generate illumination having a peak wavelength that can range between 405 nm and 445 nm, inclusive. Non-limiting examples of a blue laser peak wavelength may include about 405 nm, about 410 nm, about 415 nm, about 420 nm, about 425 nm, about 430 nm, about 435 nm, about 440 nm, about 445 nm, or any value or range of values therebetween.
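As a non-limiting illustrative sketch (not part of the original disclosure), cycling the illumination among the red, green, and blue laser sources amounts to assigning each captured frame one source in a repeating sequence; the names and the exact wavelength values below are only examples drawn from the ranges quoted above:

```python
from itertools import cycle, islice

# Peak-wavelength ranges (nm) quoted in the text; a real device would
# use one specific value within each range.
SOURCES = {"red": (635, 660), "green": (520, 532), "blue": (405, 445)}

def frame_schedule(n_frames):
    """Assign each captured frame the laser source that illuminates it,
    cycling red -> green -> blue as described above."""
    return list(islice(cycle(SOURCES), n_frames))

schedule = frame_schedule(7)
# Every third frame repeats the same source, so any three consecutive
# frames can be recombined into a single RGB image.
```

This is why the text notes the cycle time must be fast: the three single-color frames must be close enough in time to be fused without motion artifacts.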
[00442] Additionally, the illumination of the surgical site can be cycled to include non-visible light sources that can provide infrared or ultraviolet illumination. In some non-limiting examples, an infrared laser light source can generate illumination having a peak wavelength that can range between 750 nm and 3,000 nm, inclusive. Non-limiting examples of an infrared laser peak wavelength may include about 750 nm, about 1,000 nm, about 1,250 nm, about

[00443] The sensor array outputs under the different illumination wavelengths can be combined to form an RGB image, for example, if the illumination cycle time is fast enough and the laser light is in the visible range. Figures 43A and 43B illustrate a light sensor with multiple pixels receiving light reflected by an illuminated tissue, for example, by sequential exposure to red, green, blue, and infrared laser light sources (Figure 43A) or red, green, blue, and ultraviolet laser light sources (Figure 43B).

[00444] Figure 44A represents the distal end of a flexible elongated camera probe 2120 with a flexible camera probe shaft 2122 and a single light sensor module 2124 disposed at the distal end 2123 of the flexible camera probe shaft 2122. In some non-limiting examples, the flexible camera probe shaft 2122 can have an outside diameter of about 5 mm. The outside diameter of the flexible camera probe shaft 2122 may depend on geometric factors that may include, without limitation, the amount of flexion allowed in the shaft at the distal end 2123. As shown in Figure 44A, the distal end 2123 of the flexible camera probe shaft 2122 can bend about 90° with respect to a longitudinal axis of a non-curved portion of the flexible camera probe shaft 2122 located at a proximal end of the elongated camera probe 2120.
The distal end 2123 of the flexible camera probe shaft 2122 can bend by any suitable amount as may be necessary for its function. Thus, as non-limiting examples, the distal end 2123 of the flexible camera probe shaft 2122 can bend by any amount between about 0° and about 90°. Non-limiting examples of the flexion angle of the distal end 2123 of the flexible camera probe shaft 2122 may include about 0°, about 10°, about 20°, about 30°, about 40°, about 50°, about 60°, about 70°, about 80°, about 90°, or any value or range of values therebetween. In some examples, the flexion angle of the distal end 2123 of the flexible camera probe shaft 2122 can be set by a surgeon or other healthcare professional before or during a surgical procedure. In some other examples, the flexion angle of the distal end 2123 of the flexible camera probe shaft 2122 can be a fixed angle defined at the time of manufacture.

[00445] The single light sensor module 2124 can receive light reflected from the tissue when illuminated by light emitted by one or more light sources 2126 disposed at the distal end of the elongated camera probe. In some examples, the light sensor module 2124 may be a 4 mm sensor module, such as the 4 mm bezel 2136b shown in Figure 22D. It can be recognized that the light sensor module 2124 can be any size suitable for its intended function. Thus, the light sensor module 2124 can include a 5.5 mm bezel 2136a, a 2.7 mm bezel 2136c, or a 2 mm bezel 2136d, as shown in Figure 22D.

[00446] It can be recognized that the one or more light sources 2126 may include any number of light sources 2126 including, without limitation, one light source, two light sources, three light sources, four light sources, or more than four light sources.
It can further be understood that each light source can provide illumination with any central wavelength, including a central wavelength of red illumination, a central wavelength of green illumination, a central wavelength of blue illumination, a central wavelength of infrared illumination, a central wavelength of ultraviolet illumination, or any other wavelength. In some examples, the one or more light sources 2126 may include a white light source, which can illuminate the tissue with light having wavelengths that can cover the optical range of white light from about 390 nm to about 700 nm.

[00447] Figure 44B represents the distal end 2133 of an alternative elongated camera probe 2130 with multiple light sensor modules, for example, two light sensor modules 2134a and 2134b, each disposed at the distal end 2133 of the elongated camera probe 2130. In some non-limiting examples, the alternative elongated camera probe 2130 can have an outside diameter of about 7 mm. In some examples, the light sensor modules 2134a and 2134b may each comprise a 4 mm sensor module, similar to the light sensor module 2124 in Figure 44A. Alternatively, each of the light sensor modules 2134a and 2134b may comprise a 5.5 mm light sensor module, a 2.7 mm light sensor module, or a 2 mm light sensor module, as represented in Figure 22D. In some examples, both light sensor modules 2134a and 2134b may be the same size. In some examples, the light sensor modules 2134a and 2134b may have different sizes. As a non-limiting example, the alternative elongated camera probe 2130 may have a first 4 mm light sensor and two additional 2 mm light sensors. In some aspects, a visualization system can combine the optical outputs of the multiple light sensor modules 2134a and 2134b to form a 3D or near-3D image of the surgical site.
In some other aspects, the outputs of the multiple light sensor modules 2134a and 2134b can be combined in order to improve the optical resolution of the surgical site, which might otherwise not be practical with only a single light sensor module.

[00448] Each of the multiple light sensor modules 2134a and 2134b can receive light reflected from the tissue when illuminated by light emitted by one or more light sources 2136a and 2136b disposed at the distal end 2133 of the alternative elongated camera probe 2130. In some non-limiting examples, the light emitted by all of the light sources 2136a and 2136b can be derived from the same light source (such as a laser). In other non-limiting examples, the light sources 2136a surrounding a first light sensor module 2134a can emit light at a first wavelength, and the light sources 2136b surrounding a second light sensor module 2134b can emit light at a second wavelength. It can further be understood that each light source 2136a and 2136b can provide illumination with any central wavelength, including a central wavelength of red illumination, a central wavelength of green illumination, a central wavelength of blue illumination, a central wavelength of infrared illumination, a central wavelength of ultraviolet illumination, or any other wavelength. In some examples, the one or more light sources 2136a and 2136b may include a white light source, which can illuminate the tissue with light having wavelengths that can cover the optical range of white light from about 390 nm to about 700 nm.

[00449] In some additional aspects, the distal end 2133 of the elongated camera probe 2130 may include one or more working channels 2138. Such working channels 2138 may be in fluid communication with a suction port of a device for aspirating material from the surgical site, thus allowing the removal of material that could potentially obscure the field of view of the light sensor modules 2134a and 2134b.
Alternatively, such working channels 2138 may be in fluid communication with a fluid source port of a device for supplying fluid to the surgical site, to flush debris or material away from the surgical site. Such fluids can be used to clear material from the field of view of the light sensor modules 2134a and 2134b.

[00450] Figure 44C represents a perspective view of an aspect of a monolithic sensor 2160 that has a plurality of pixel arrays for producing a three-dimensional image in accordance with the teachings and principles of the description. Such an implementation may be desirable for capturing three-dimensional images, since the two pixel arrays 2162 and 2164 can be offset during use. In another implementation, a first pixel array 2162 and a second pixel array 2164 can be dedicated to receiving a predetermined range of wavelengths of electromagnetic radiation, with the first pixel array 2162 being dedicated to a range of electromagnetic radiation wavelengths different from that of the second pixel array 2164.

[00451] Additional descriptions of a dual sensor array can be found in US patent application publication No. 2014/0267655, entitled SUPER RESOLUTION AND COLOR

[00452] In some aspects, a light sensor module may comprise a multi-pixel light sensor, such as a CMOS array, in addition to one or more additional optical elements, such as a lens, a reticle, and a filter.

[00453] In some alternative aspects, the one or more light sensors may be located inside the body 2021 of the hand unit.

[00454] The images obtained from each of the multiple light sensors, for example, 2134a and 2134b, can be combined or processed in several different ways, in combination or separately, and then displayed in order to allow a surgeon to view different aspects of the surgical site.

[00455] In a non-limiting example, each light sensor can have an independent field of view.
In some additional examples, the field of view of a first light sensor may partially or completely overlap the field of view of a second light sensor.

[00456] As noted above, an imaging system may include a hand unit 2020 that has an elongated camera probe 2024 with one or more light sensor modules 2124, 2134a, and 2134b disposed at its distal end 2123, 2133. As one example, the elongated camera probe 2024 may have two light sensor modules 2134a and 2134b, although it can be recognized that there may be three, four, five, or more light sensor modules at the distal end of the elongated camera probe.

[00457] Figure 45 represents a general view of a distal end 2143 of an elongated camera probe that has multiple light sensor modules 2144a and 2144b. Each light sensor module 2144a and 2144b can be composed of a CCD or CMOS sensor and one or more optical elements such as filters, lenses, shutters, and the like. In some aspects, the components of the light sensor modules 2144a and 2144b can be fixed inside the elongated camera probe. In some other aspects, one or more of the components of the light sensor modules 2144a and 2144b may be adjustable. For example, the CCD or CMOS sensor of a light sensor module 2144a and 2144b can be mounted on a movable bezel to allow automated adjustment of the center 2145a and 2145b of a field of view 2147a and 2147b of the CCD or CMOS sensor. In some other aspects, the CCD or CMOS sensor can be fixed, but a lens in each light sensor module 2144a and 2144b can be adjustable to change the focus. In some aspects, the light sensor modules 2144a and 2144b may include adjustable irises to allow changes in the visual aperture of the sensor modules 2144a and 2144b.

[00458] As shown in Figure 45, each of the sensor modules 2144a and 2144b can have a field of view 2147a and 2147b with an acceptance angle.
As shown in Figure 45, the acceptance angle for each of the sensor modules 2144a and 2144b can be greater than 90°. In some examples, the acceptance angle can be about 100°. In some examples, the acceptance angle can be about 120°. In some examples, if the sensor modules 2144a and 2144b have an acceptance angle greater than 90° (for example, 100°), the fields of view 2147a and 2147b can form an overlap region 2150a and 2150b. In some aspects, an optical field of view that has an acceptance angle of 100° or more can be called a "fisheye" field of view. A visualization system control system associated with such an elongated camera probe may include computer-readable instructions that can allow the display of the overlap region 2150a and 2150b in such a way that the extreme curvature of the overlapping fisheye fields of view is corrected, and an improved, flattened image can be displayed. In Figure 45, the overlap region 2150a can represent a region where the overlapping fields of view 2147a and 2147b of the sensor modules 2144a and 2144b have their respective centers 2145a and 2145b directed in a forward direction. However, if any one or more components of the sensor modules 2144a and 2144b are adjustable, it can be recognized that the overlap region 2150b can be aimed at any angle attainable within the fields of view 2147a and 2147b of the sensor modules 2144a and 2144b.

[00459] Figures 46A to 46D represent a variety of examples of an elongated light probe with two light sensor modules 2144a and 2144b having a variety of fields of view. The elongated light probe can be directed to view a surface 2152 of a surgical site.

[00460] In Figure 46A, the first light sensor module 2144a has a first sensor field of view 2147a of a tissue surface 2154a, and the second light sensor module 2144b has a second sensor field of view 2147b of a tissue surface 2154b.
As shown in Figure 46A, the first field of view 2147a and the second field of view 2147b have approximately the same angle of view. In addition, the first sensor field of view 2147a is adjacent to, but does not overlap, the second sensor field of view 2147b. The image received by the first light sensor module 2144a can be displayed separately from the image received by the second light sensor module 2144b, or the images can be combined to form a single image. In some non-limiting examples, the viewing angle of a lens associated with the first light sensor module 2144a and the viewing angle of a lens associated with the second light sensor module 2144b can be somewhat narrow, and the image distortion may not be significant at the periphery of their respective images. Therefore, the images can be easily combined edge to edge.

[00461] As shown in Figure 46B, the first field of view 2147a and the second field of view 2147b have approximately the same angular field of view, and the first sensor field of view 2147a completely overlaps the second sensor field of view 2147b. This can result in a first sensor field of view 2147a of a tissue surface 2154a that is identical to the view of a tissue surface 2154b as obtained by the second light sensor module 2144b from the second sensor field of view 2147b. This configuration can be useful for applications in which the image from the first light sensor module 2144a can be processed differently from the image from the second light sensor module 2144b. The information in the first image can complement the information in the second image and refer to the same piece of tissue.

[00462] As shown in Figure 46C, the first field of view 2147a and the second field of view 2147b have approximately the same angular field of view, and the first sensor field of view 2147a partially overlaps the second sensor field of view 2147b.
In some non-limiting examples, a lens associated with the first light sensor module 2144a and a lens associated with the second light sensor module 2144b can be wide-angle lenses. Such lenses can permit viewing of a wider field of view than that shown in Figure 46A. Wide-angle lenses are known to have significant optical distortion at their periphery. Proper processing of the images obtained by the first light sensor module 2144a and the second light sensor module 2144b can allow the formation of a combined image in which the central portion of the combined image is corrected for any distortion induced by the first lens or the second lens. It can be understood that a portion of the first sensor field of view 2147a of a tissue surface 2154a may therefore have some distortion due to the wide-angle nature of a lens associated with the first light sensor module 2144a, and a portion of the second sensor field of view 2147b of a tissue surface 2154b may therefore have some distortion due to the wide-angle nature of a lens associated with the second light sensor module 2144b. However, a portion of the tissue seen in the overlap region 2150' of the two light sensor modules 2144a and 2144b can be corrected for any distortion induced by either of the light sensor modules 2144a and 2144b. The configuration shown in Figure 46C can be useful for applications in which a wide field of view of the tissue around a portion of a surgical instrument is desired during a surgical procedure. In some examples, the lenses associated with each light sensor module 2144a and 2144b may be independently controllable, thereby controlling the location of the overlap region 2150' of the view within the combined image.

[00463] As shown in Figure 46D, the first light sensor module 2144a can have a first angular field of view 2147a that is wider than the second angular field of view 2147b of the second light sensor module 2144b.
In some non-limiting examples, the second sensor field of view 2147b can be disposed entirely within the first sensor field of view 2147a. In alternative examples, the second sensor field of view may be outside of or tangent to the wide-angle field of view 2147a of the first sensor 2144a. A display system using the configuration shown in Figure 46D can display a wide-angle tissue portion 2154a imaged by the first sensor module 2144a along with a second, magnified tissue portion 2154b that is imaged by the second sensor module 2144b and is located in an overlap region 2150" of the first field of view 2147a and the second field of view 2147b. This configuration can be useful for presenting a surgeon with a close-up image of the tissue adjacent to a surgical instrument (for example, within the second tissue portion 2154b) together with a wide-field image of the tissue surrounding the immediate vicinity of the medical instrument (for example, the first, proximal tissue portion 2154a). In some non-limiting examples, the image presented by the second, narrower field of view 2147b of the second light sensor module 2144b can be a surface image of the surgical site. In some additional examples, the image shown in the first, wide field of view 2147a of the first light sensor module 2144a may include a display based on a hyperspectral analysis of the tissue viewed in the wide field of view.

[00464] Figures 47A to 47C illustrate an example of the use of an imaging system that incorporates the features disclosed in Figure 46D. Figure 47A schematically illustrates a proximal view 2170 of the distal end of the elongated camera probe, representing the light sensor arrays 2172a and 2172b of the two light sensor modules 2174a and 2174b. A first light sensor module 2174a may include a wide-angle lens, and the second light sensor module 2174b may include a narrow-angle lens. In some aspects, the second light sensor module 2174b may have a narrow-aperture lens.
In other aspects, the second light sensor module 2174b may have a magnifying lens. The tissue can be illuminated by the light sources disposed at the distal end of the elongated camera probe. The light sensor arrays 2172' (the light sensor array 2172a or 2172b, or both 2172a and 2172b) can receive the light reflected by the tissue upon illumination. The tissue can be illuminated by light from a red laser source, a green laser source, a blue laser source, an infrared laser source, and/or an ultraviolet laser source. In some aspects, the light sensor arrays 2172' can sequentially receive red laser light 2175a, green laser light 2175b, blue laser light 2175c, infrared laser light 2175d, and ultraviolet laser light 2175e. The tissue can be illuminated by any combination of such laser sources simultaneously, as shown in Figures 23E and 23F. Alternatively, the illumination light may be cycled among any combination of such laser sources, as shown, for example, in Figure 23D and Figures 43A and 43B.

[00465] Figure 47B schematically represents a portion of lung tissue 2180 that may contain a tumor 2182. The tumor 2182 may be in communication with blood vessels, including one or more veins 2184 and/or arteries 2186. In some surgical procedures, the blood vessels (veins 2184 and arteries 2186) associated with the tumor 2182 may require resection and/or cauterization before removal of the tumor.

[00466] Figure 47C illustrates the use of a dual imaging system, as described above with respect to Figure 47A. The first light sensor module 2174a can capture a wide-angle image of the tissue surrounding a blood vessel 2187 to be cut with a surgical knife 2190. The wide-angle image can allow the surgeon to verify the blood vessel 2187 to be separated. In addition, the second light sensor module 2174b can capture a narrow-angle image of the specific blood vessel 2187 to be manipulated.
The narrow-angle image can show the surgeon the progress of the manipulation of the blood vessel 2187. In this way, the surgeon receives an image of the vascular tissue to be manipulated as well as of its surroundings, ensuring that the correct blood vessel is being manipulated.

[00467] Figures 48A and 48B represent another example of the use of a dual imaging system. Figure 48A shows a primary surgical screen providing an image of a section of a surgical site. The primary surgical screen can show a wide view image 2800 of a section of intestine 2802 along with its vasculature 2804. The wide view image 2800 can include a portion of the surgical field 2809 that can be displayed separately as a magnified view 2810 on a secondary surgical screen (Figure 48B). As shown above with respect to the surgery to remove a tumor from a lung (Figures 47A to 47C), it may be necessary to dissect the blood vessels that feed a tumor 2806 before removing the cancerous tissue. The vasculature 2804 that feeds the intestines 2802 is complex and highly branched. It may be necessary to determine which blood vessels feed the tumor 2806 and to identify the blood vessels that supply blood to healthy intestinal tissue. The wide view image 2800 allows a surgeon to determine which blood vessel may feed the tumor 2806. The surgeon can then test a blood vessel using a grasping device 2812 to determine whether or not the blood vessel feeds the tumor 2806.

[00468] Figure 48B shows a secondary surgical screen that can display only a narrow, magnified view image 2810 of a portion of the surgical field 2809. The narrow magnified view image 2810 can show a close view of the vascular tree 2814 so that the surgeon can concentrate on dissecting only the blood vessel of interest 2815. To perform the resection of the blood vessel of interest 2815, a surgeon can use an intelligent RF cauterization device 2816.
It can be understood that any image obtained by means of the visualization system can include not only images of the tissue at the surgical site but also images of the surgical instruments inserted therein. In some aspects, such a surgical screen (the primary screen in Figure 48A or the secondary screen in Figure 48B) may also include symbols 2817 related to the functions or settings of any surgical device used during the surgical procedure. For example, the symbols 2817 can include a power setting of the intelligent RF cauterization device 2816. In some aspects, such intelligent medical devices can transmit data related to their operational parameters to the visualization system so that the visualization system can incorporate them into the data to be transmitted to one or more display devices.

[00469] Figures 49A to 49C illustrate examples of a sequence of surgical steps for the removal of an intestinal/colon tumor, which may benefit from the use of multiple image analyses of the surgical site. Figure 49A shows a portion of the surgical site, including the intestines 2932 and the branched vasculature 2934 that supplies blood and nutrients to the intestines 2932. The intestines 2932 may have a tumor 2936 surrounded by a tumor margin 2937. The first light sensor module of a visualization system can have a wide field of view 2930 and can provide imaging data from the wide field of view 2930 to a display system. The second light sensor module of the visualization system can have a narrow or standard field of view 2940 and can provide imaging data from the narrow field of view 2940 to the display system. In some aspects, the wide-field image and the narrow-field image can be displayed by the same display device. In another aspect, the wide-field image and the narrow-field image can be displayed by separate devices.

[00470] During the surgical procedure, it may be important to remove not only the tumor 2936 but also the margin 2937 surrounding it, to ensure complete removal of the tumor.
The wide-angle field of view 2930 can be used to image both the vasculature 2934 and the section of intestine 2932 surrounding the tumor 2936 and the margin 2937. As noted above, the vasculature that feeds the tumor 2936 and the margin 2937 must be removed, but the vasculature that feeds the surrounding intestinal tissue must be preserved to provide oxygen and nutrients to that tissue. Transection of the vasculature that feeds the surrounding colon tissue would deprive that tissue of oxygen and nutrients, leading to necrosis. In some examples, Doppler laser imaging of the tissue viewed in the wide-angle field 2930 can be analyzed to provide a speckle contrast analysis 2933, indicating blood flow within the intestinal tissue. [00471] Figure 49B illustrates a step during the surgical procedure. The surgeon may not be sure which part of the vascular tree supplies blood to the tumor 2936. The surgeon can test a blood vessel 2944 to determine whether it feeds the tumor 2936 or healthy tissue. The surgeon can grasp the blood vessel 2944 with a grasping device 2812 and determine, by speckle contrast analysis, the section of intestinal tissue 2943 that is no longer perfused. The narrow field of view 2940 displayed on an imaging device can assist the surgeon with the close view and detailed work required to visualize the single blood vessel 2944 being tested. When the suspect blood vessel 2944 is clamped, a portion of the intestinal tissue 2943 is determined to lack perfusion based on the speckle contrast analysis of the Doppler imaging. As shown in Figure 49B, the suspect blood vessel 2944 does not supply blood to the tumor 2936 or to the tumor margin 2937, and is therefore recognized as a blood vessel that must be spared during the surgical procedure. [00472] Figure 49C depicts a next stage of the surgical procedure. At this stage, a supplying blood vessel 2984 has been identified as supplying blood to the margin 2937 of the tumor.
When this supplying blood vessel 2984 is severed, blood is no longer supplied to a section of the intestine 2987 that may include at least a portion of the margin 2937 of the tumor 2936. In some aspects, the lack of perfusion to the intestine section 2987 can be determined by means of a speckle contrast analysis based on a Doppler analysis of blood flow to the intestine. The non-perfused section of the intestines 2987 can then be isolated by means of a seal 2985 applied to the intestine. In this way, only the blood vessels that perfuse the tissue indicated for surgical removal are identified and sealed, thus sparing healthy tissue from unintended surgical consequences. [00473] In some additional aspects, a surgical visualization system can permit imaging analysis of the surgical site. [00474] In some aspects, the surgical site can be inspected for the effectiveness of a surgical manipulation of a tissue. Non-limiting examples of such inspections may include the inspection of surgical staples or welds used to seal tissue at a surgical site. Coherent cone-beam tomography using one or more light sources can be used for such methods. [00475] In some additional aspects, an image of a surgical site may have reference points indicated in the image. In some examples, the reference points can be determined using image analysis techniques. In some alternative examples, the reference points can be indicated manually in the image by the surgeon. [00476] In some additional aspects, non-intelligent, off-the-shelf visualization methods can be imported for use in image fusion techniques in the central controller. [00477] In additional aspects, instruments that are not integrated into the central controller system can be identified and tracked during their use within the surgical site.
In this regard, computational and/or storage components of the central controller or any of its components (including, for example, the cloud-based system) may include a database of images of EES and competitive surgical instruments that are identifiable from one or more images captured through any image capture system, or through visual analysis of such alternative instruments. Imaging analysis of such devices can also permit identification of when one instrument is replaced by a different instrument to do the same or a similar job. Identifying the replacement of an instrument during a surgical procedure can provide information related to an instrument not performing its job, or information about a device failure. Situational awareness [00478] Situational awareness is the ability of some aspects of a surgical system to determine or infer information related to a surgical procedure from data received from databases and/or instruments. The information can include the type of procedure being performed, the type of tissue being operated on, or the body cavity that is the object of the procedure. With contextual information related to the surgical procedure, the surgical system can, for example, improve the manner in which it controls the modular devices (for example, a robotic arm and/or robotic surgical instrument) that are connected to it, and provide contextualized information or suggestions to the surgeon during the course of the surgical procedure. [00479] Figure 50 shows a timeline 5200 representing the situational awareness of a central controller, such as the central surgical controller 106 or 206, for example. Timeline 5200 depicts an illustrative surgical procedure and the contextual information that the central surgical controller 106, 206 can derive from the data received from data sources at each step of the surgical procedure.
Timeline 5200 represents the typical steps that would be taken by the nurses, surgeons, and other medical personnel during the course of a pulmonary segmentectomy procedure, starting with the setup of the operating room and ending with the transfer of the patient to a postoperative recovery room. [00480] The situationally aware central surgical controller 106, 206 receives data from data sources throughout the course of the surgical procedure, including data generated each time medical personnel use a modular device that is paired with the central surgical controller 106, 206. The central surgical controller 106, 206 can receive this data from the paired modular devices and other data sources and continually derive inferences (i.e., contextual information) about the ongoing procedure as new data is received, such as which step of the procedure is being performed at a given time. The situational awareness system of the central surgical controller 106, 206 is capable of, for example, recording data related to the procedure to generate reports, verifying the steps being taken by the medical personnel, providing data or warnings (for example, via a display) that may be relevant to the specific step of the procedure, adjusting the modular devices based on the context (for example, activating monitors, adjusting the field of view (FOV) of the medical imaging device, or changing the energy level of an ultrasonic surgical instrument or RF electrosurgical instrument), and taking any other action described above. [00481] In the first step 5202 of this illustrative procedure, members of the hospital team retrieve the patient's electronic health record (EHR) from the hospital's EHR database. Based on select patient data in the EHR, the central surgical controller 106, 206 determines that the procedure to be performed is a thoracic procedure. [00482] In the second step 5204, the team members scan the incoming medical supplies for the procedure.
The central surgical controller 106, 206 cross-references the scanned supplies with a list of supplies that are used in various types of procedures and confirms that the mix of supplies corresponds to a thoracic procedure. In addition, the central surgical controller 106, 206 is also able to determine that the procedure is not a wedge procedure (because the incoming supplies either lack certain supplies that are necessary for a thoracic wedge procedure, or otherwise do not correspond to a thoracic wedge procedure). [00483] In the third step 5206, the medical personnel scan the patient's wristband with a scanner that is communicably connected to the central surgical controller 106, 206. The central surgical controller 106, 206 can then confirm the patient's identity based on the scanned data. [00484] In the fourth step 5208, the medical staff turns on the auxiliary equipment. The auxiliary equipment being used can vary according to the type of surgical procedure and the techniques to be used by the surgeon, but in this illustrative case it includes a smoke evacuator, an insufflator, and a medical imaging device. When activated, the auxiliary equipment that comprises modular devices can automatically pair with the central surgical controller 106, 206, which is located within a particular vicinity of the modular devices, as part of their initialization process. The central surgical controller 106, 206 can then derive contextual information about the surgical procedure by detecting the types of modular devices that pair with it during this preoperative or initialization phase. In this particular example, the central surgical controller 106, 206 determines that the surgical procedure is a VATS (video-assisted thoracic surgery) procedure based on this specific combination of paired modular devices.
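The device-based inference described in this step can be sketched as a simple signature match, assuming hypothetical procedure-to-device tables; the procedure names, device names, and the preference for the most specific match are illustrative assumptions, not data defined by the disclosure:

```python
# Hypothetical signatures: each procedure maps to the set of modular
# devices expected to pair with the controller during setup.
PROCEDURE_DEVICE_SIGNATURES = {
    "VATS segmentectomy": {"smoke evacuator", "insufflator",
                           "medical imaging device"},
    "open thoracotomy": {"smoke evacuator"},
}

def infer_procedure(paired_devices):
    """Return the procedure whose device signature is fully present,
    preferring the most specific (largest) matching signature."""
    candidates = [
        (len(devices), name)
        for name, devices in PROCEDURE_DEVICE_SIGNATURES.items()
        if devices <= paired_devices  # signature is a subset of paired set
    ]
    return max(candidates)[1] if candidates else None

paired = {"smoke evacuator", "insufflator", "medical imaging device"}
print(infer_procedure(paired))  # VATS segmentectomy
```

In practice, the controller would combine this device-based signal with the EHR data and the scanned supply list, as the following paragraph describes; the sketch isolates only the device-combination step.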
Based on the combination of data from the patient's electronic health record (EHR), the list of medical supplies to be used in the procedure, and the type of modular devices that connect to the central controller, the central surgical controller 106, 206 can, in general, infer the specific procedure that the surgical team will perform. Once the central surgical controller 106, 206 knows which specific procedure is being performed, the central surgical controller 106, 206 can then retrieve the steps of that procedure from a memory or from the cloud and then cross-reference the data it subsequently receives from the connected data sources (for example, modular devices and patient monitoring devices) to infer which step of the surgical procedure the surgical team is performing. [00485] In the fifth step 5210, the team members attach the electrocardiogram (ECG) electrodes and other patient monitoring devices to the patient. The ECG electrodes and other patient monitoring devices are able to pair with the central surgical controller 106, 206. As the central surgical controller 106, 206 begins to receive data from the patient monitoring devices, the central surgical controller 106, 206 can thus confirm that the patient is in the operating room. [00486] In the sixth step 5212, the medical personnel induce anesthesia in the patient. The central surgical controller 106, 206 can infer that the patient is under anesthesia based on data from the modular devices and/or patient monitoring devices, including ECG data, blood pressure data, ventilator data, or combinations thereof, for example. Upon completion of the sixth step 5212, the preoperative portion of the lung segmentectomy procedure is completed and the operative portion begins. [00487] In the seventh step 5214, the lung of the patient being operated on is collapsed (while ventilation is switched to the contralateral lung).
The central surgical controller 106, 206 can infer from the ventilator data that the patient's lung has been collapsed, for example. The central surgical controller 106, 206 can infer that the operative portion of the procedure has commenced, as it can compare the detection of the patient's lung collapsing against the expected steps of the procedure (which can be accessed or retrieved previously) and thereby determine that collapsing the lung is the first operative step in this particular procedure. [00488] In the eighth step 5216, the medical imaging device (for example, a scope) is inserted and video from the medical imaging device is initiated. [00489] In the ninth step 5218 of the procedure, the surgical team begins the dissection step. The central surgical controller 106, 206 can infer that the surgeon is in the process of dissecting to mobilize the patient's lung because it receives data from the RF or ultrasonic generator indicating that an energy instrument is being fired. The central surgical controller 106, 206 can cross-reference the received data with the retrieved steps of the surgical procedure to determine that an energy instrument being fired at this point in the procedure (that is, after completion of the previously discussed steps of the procedure) corresponds to the dissection step. In certain cases, the energy instrument can be an energy tool mounted on a robotic arm of a robotic surgical system. [00490] In the tenth step 5220 of the procedure, the surgical team proceeds to the ligation step. The central surgical controller 106, 206 can infer that the surgeon is ligating arteries and veins because it receives data from the surgical stapling and cutting instrument indicating that the instrument is being fired.
Similar to the previous step, the central surgical controller 106, 206 can derive this inference by cross-referencing the data received from the surgical stapling and cutting instrument with the retrieved steps of the process. In certain cases, the surgical instrument can be a surgical tool mounted on a robotic arm of a robotic surgical system. [00491] In the eleventh step 5222, the segmentectomy portion of the procedure is performed. The central surgical controller 106, 206 can infer that the surgeon is transecting the parenchyma based on data from the surgical stapling and cutting instrument, including data from its cartridge. The cartridge data can correspond to the size or type of staple being fired by the instrument, for example. As different types of staples are used for different types of tissues, the cartridge data can thus indicate the type of tissue being stapled and/or transected. In this case, the type of staple being fired is utilized for parenchyma (or other similar tissue types), which allows the central surgical controller 106, 206 to infer that the segmentectomy portion of the procedure is being performed. [00492] In the twelfth step 5224, the node dissection step is then performed. The central surgical controller 106, 206 can infer that the surgical team is dissecting the node and performing a leak test based on data received from the generator indicating that an ultrasonic or RF instrument is being fired. For this particular procedure, an RF or ultrasonic instrument being used after the parenchyma has been transected corresponds to the node dissection step, which allows the central surgical controller 106, 206 to make this inference. It should be noted that surgeons regularly switch back and forth between surgical stapling/cutting instruments and surgical energy (that is, RF or ultrasonic) instruments, depending on the particular step in the procedure, because different instruments are better adapted for particular tasks.
Therefore, the particular sequence in which the stapling/cutting instruments and surgical energy instruments are used can indicate which step of the procedure the surgeon is performing. Furthermore, in certain cases, robotic tools can be used for one or more steps in a surgical procedure, and/or hand-held surgical instruments can be used for one or more steps in the surgical procedure. The surgeon can switch between robotic tools and hand-held surgical instruments and/or can use the devices simultaneously, for example. Upon completion of the twelfth step 5224, the incisions are closed and the postoperative portion of the procedure begins. [00493] In the thirteenth step 5226, the patient's anesthesia is reversed. The central surgical controller 106, 206 can infer that the patient is emerging from anesthesia based on ventilator data (i.e., the patient's breathing rate begins to increase), for example. [00494] Finally, in the fourteenth step 5228, the medical personnel remove the various patient monitoring devices from the patient. The central surgical controller 106, 206 can thus infer that the patient is being transferred to a recovery room when the central controller loses the ECG, blood pressure, and other data from the patient monitoring devices. As can be seen from the description of this illustrative procedure, the central surgical controller 106, 206 can determine or infer when each step of a given surgical procedure is taking place according to the data received from the various data sources that are communicably coupled to the central surgical controller 106, 206. [00495] Situational awareness is further described in US provisional patent application serial number 62/611,341, entitled INTERACTIVE SURGICAL PLATFORM, filed on December 28, 2017, the description of which is incorporated herein by reference in its entirety.
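The step-tracking logic described across steps 5202 to 5228 can be sketched as matching incoming device events against the retrieved procedure plan; the step names and event strings below are hypothetical placeholders, not data defined by the disclosure:

```python
# Hypothetical expected-step table for the recognized procedure, pairing
# each step with the instrument event that signals it.
EXPECTED_STEPS = [
    ("lung collapse", "ventilator: single-lung ventilation"),
    ("dissection", "generator: energy instrument fired"),
    ("ligation", "stapler: instrument fired"),
    ("segmentectomy", "stapler: parenchyma cartridge fired"),
    ("node dissection", "generator: energy instrument fired"),
]

def advance_step(current_index, event):
    """Advance to the next expected step only when the incoming device
    event matches what the retrieved procedure plan predicts next."""
    if current_index + 1 < len(EXPECTED_STEPS) and \
            EXPECTED_STEPS[current_index + 1][1] == event:
        return current_index + 1
    return current_index

idx = 0  # lung collapse already detected from ventilator data
idx = advance_step(idx, "generator: energy instrument fired")
print(EXPECTED_STEPS[idx][0])  # dissection
```

Note how the same event string ("generator: energy instrument fired") maps to dissection early in the plan and to node dissection later, mirroring the description above that the position of an instrument firing within the sequence, not the firing alone, identifies the step.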
In certain cases, the operation of a robotic surgical system, including the various robotic surgical systems disclosed herein, for example, can be controlled by the central controller 106, 206 based on its situational awareness and/or feedback from its components and/or based on information from the cloud 102. [00496] Various aspects of the subject matter described herein are set forth in the following numbered examples. [00497] Example 1. A minimally invasive image capture system comprising: a plurality of light sources, each light source being configured to emit light that has a specified central wavelength; a first light sensing element that has a first field of view and is configured to receive reflected illumination from a first portion of a surgical site when the first portion of the surgical site is illuminated by at least one of the plurality of light sources; a second light sensing element that has a second field of view and is configured to receive reflected illumination from a second portion of the surgical site when the second portion of the surgical site is illuminated by at least one of the plurality of light sources, the second field of view overlapping at least a portion of the first field of view; and a computing system, the computing system being configured to receive data from the first light sensing element, receive data from the second light sensing element, compute imaging data based on the data received from the first light sensing element and the data received from the second light sensing element, and transmit the imaging data for reception by a display system. [00498] Example 2. The minimally invasive image capture system of Example 1, wherein the first field of view has a first angle and the second field of view has a second angle, and the first angle is equal to the second angle. [00499] Example 3.
The minimally invasive image capture system of any of Examples 1 and 2, wherein the first field of view has a first angle and the second field of view has a second angle, and the first angle is different from the second angle. [00500] Example 4. The minimally invasive image capture system of any of Examples 1 to 3, wherein the first light sensing element has an optical component configured to adjust the first field of view. [00501] Example 5. The minimally invasive image capture system of any of Examples 1 to 4, wherein the second light sensing element has an optical component configured to adjust the second field of view. [00502] Example 6. The minimally invasive image capture system of any of Examples 1 to 5, wherein the second field of view overlaps the entire first field of view. [00503] Example 7. The minimally invasive image capture system of any of Examples 1 to 6, wherein the first field of view is completely surrounded by the second field of view. [00504] Example 8. The minimally invasive image capture system of any of Examples 1 to 7, wherein the first light sensing element and the second light sensing element are at least partially disposed within an elongated camera probe. [00505] Example 9. The minimally invasive image capture system of any of Examples 1 to 8, wherein one of the plurality of light sources is configured to emit light that has a specified central wavelength within a visible spectrum. [00506] Example 10. The minimally invasive image capture system of any of Examples 1 to 9, wherein at least one of the plurality of light sources is configured to emit light that has a specified central wavelength outside of a visible spectrum. [00507] Example 11. The minimally invasive image capture system of Example 10, wherein the specified central wavelength outside the visible spectrum is within an ultraviolet range.
[00508] Example 12. The minimally invasive image capture system of any of Examples 10 and 11, wherein the specified central wavelength outside the visible spectrum is within an infrared range. [00509] Example 13. The minimally invasive image capture system of any of Examples 1 to 12, wherein the computing system configured to compute the imaging data based on the data received from the first light sensing element and the data received from the second light sensing element comprises a computing system configured to perform a first data analysis on the data received from the first light sensing element and a second data analysis on the data received from the second light sensing element. [00510] Example 14. The minimally invasive image capture system of Example 13, wherein the first data analysis differs from the second data analysis. [00511] Example 15. A minimally invasive image capture system comprising: a processor and a memory coupled to the processor, the memory storing instructions executable by the processor to: control an operation of a plurality of light sources directed at a tissue sample, each light source being configured to emit light with a specified central wavelength; receive, from a first light sensing element, first data related to reflected illumination from a first portion of the surgical site when the first portion of the surgical site is illuminated by at least one of the plurality of light sources; receive, from a second light sensing element, second data related to reflected illumination from a second portion of the surgical site when the second portion of the surgical site is illuminated by at least one of the plurality of light sources, the second field of view overlapping at least a portion of the first field of view; compute imaging data based on the first data received from the first light sensing element and the second data received from the second light sensing element; and transmit
the imaging data for reception by a display system. [00512] Example 16. The minimally invasive image capture system of Example 15, wherein the memory coupled to the processor further stores instructions executable by the processor to receive, from a surgical instrument, operational data related to a function or state of the surgical instrument. [00513] Example 17. The minimally invasive image capture system of Example 16, wherein the memory coupled to the processor further stores instructions executable by the processor to compute the imaging data based on the first data received from the first light sensing element, the second data received from the second light sensing element, and the operational data related to the function or state of the surgical instrument. [00514] Example 18. A minimally invasive image capture system comprising a control circuit configured to: control an operation of a plurality of light sources directed at a tissue sample, each light source being configured to emit light with a specified central wavelength; receive, from a first light sensing element, first data related to reflected illumination from a first portion of the surgical site when the first portion of the surgical site is illuminated by at least one of the plurality of light sources; receive, from a second light sensing element, second data related to reflected illumination from a second portion of the surgical site when the second portion of the surgical site is illuminated by at least one of the plurality of light sources, the second field of view overlapping at least a portion of the first field of view; compute imaging data based on the first data received from the first light sensing element and the second data received from the second light sensing element; and transmit the imaging data for reception by a display system. [00515] Example 19.
A non-transitory computer-readable medium storing computer-readable instructions that, when executed, cause a machine to: control an operation of a plurality of light sources directed at a tissue sample, each light source being configured to emit light with a specified central wavelength; receive, from a first light sensing element, first data related to reflected illumination from a first portion of the surgical site when the first portion of the surgical site is illuminated by at least one of the plurality of light sources; receive, from a second light sensing element, second data related to reflected illumination from a second portion of the surgical site when the second portion of the surgical site is illuminated by at least one of the plurality of light sources, the second field of view overlapping at least a portion of the first field of view; compute imaging data based on the first data received from the first light sensing element and the second data received from the second light sensing element; and transmit the imaging data for reception by a display system. [00516] Although several forms have been illustrated and described, it is not the intention of the applicant to restrict or limit the scope of the appended claims to such detail. Numerous modifications, variations, alterations, substitutions, combinations, and equivalents of these forms can be implemented and will occur to those skilled in the art without departing from the scope of the present description. Moreover, the structure of each element associated with a described form can alternatively be described as a means for providing the function performed by the element. In addition, where materials are disclosed for certain components, other materials can be used.
It should be understood, therefore, that the foregoing description and the appended claims are intended to cover all such modifications, combinations, and variations as fall within the scope of the disclosed forms. The appended claims are intended to cover all such modifications, variations, alterations, substitutions, and equivalents. [00517] The foregoing detailed description has set forth various forms of the devices and/or processes via the use of block diagrams, flowcharts, and/or examples. Insofar as such block diagrams, flowcharts, and/or examples contain one or more functions and/or operations, it will be understood by those skilled in the art that each function and/or operation within such block diagrams, flowcharts, and/or examples can be implemented, individually and/or collectively, by a wide range of hardware, software, firmware, or virtually any combination thereof. Those skilled in the art will recognize, however, that some aspects of the forms disclosed herein, in whole or in part, can be equivalently implemented in integrated circuits, as one or more computer programs running on one or more computers (for example, as one or more programs running on one or more computer systems), as one or more programs running on one or more processors (for example, as one or more programs running on one or more microprocessors), as firmware, or as virtually any combination thereof, and that designing the circuitry and/or writing the code for the software and firmware would be well within the skill of one skilled in the art in light of this description. In addition, those skilled in the art will appreciate that the mechanisms of the subject matter described herein are capable of being distributed as one or more program products in a variety of forms, and that an illustrative form of the subject matter described herein applies regardless of the particular type of signal-bearing transmission medium used to actually carry out the distribution. [00518] Instructions used to program logic to perform the various disclosed aspects can be stored within a memory in the system, such as dynamic random access memory (DRAM), cache, flash memory, or other storage. Furthermore, the instructions can be distributed via a network or by way of other computer-readable media. Thus, a machine-readable medium can include any mechanism for storing or transmitting information in a form readable by a machine (for example, a computer), including, but not limited to, floppy diskettes, optical disks, compact disc read-only memories (CD-ROMs), magneto-optical disks, read-only memory (ROM), random access memory (RAM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), magnetic or optical cards, flash memory, or a tangible, machine-readable storage medium used in the transmission of information over the Internet via electrical, optical, acoustical, or other forms of propagated signals (for example, carrier waves, infrared signals, digital signals, etc.). Accordingly, the non-transitory computer-readable medium includes any type of machine-readable medium suitable for storing or transmitting electronic instructions or information in a form readable by a machine (for example, a computer). [00519] As used in any aspect of the present invention, the term "control circuit" can refer to, for example, hardwired circuitry, programmable circuitry (for example, a computer processor comprising one or more individual instruction processing cores, a processing unit, a processor, a microcontroller, a microcontroller unit, a controller, a digital signal processor (DSP), a programmable logic device (PLD), a programmable logic array (PLA), or a field programmable gate array (FPGA)), state machine circuitry, firmware that stores instructions executed by the programmable circuitry, and any combination thereof.
The control circuit can, collectively or individually, be embodied as circuitry that forms part of a larger system, for example, an integrated circuit (IC), an application-specific integrated circuit (ASIC), a system on-chip (SoC), desktop computers, laptop computers, tablet computers, servers, smartphones, etc. Accordingly, as used in the present invention, "control circuit" includes, but is not limited to, electrical circuitry having at least one discrete electrical circuit, electrical circuitry having at least one integrated circuit, electrical circuitry having at least one application-specific integrated circuit, electrical circuitry forming a general-purpose computing device configured by a computer program (for example, a general-purpose computer configured by a computer program that at least partially carries out the processes and/or devices described herein, or a microprocessor configured by a computer program that at least partially carries out the processes and/or devices described herein), electrical circuitry forming a memory device (for example, forms of random access memory), and/or electrical circuitry forming a communications device (for example, a modem, a communications switch, or optical-electrical equipment). Those skilled in the art will recognize that the subject matter described herein can be implemented in an analog or digital fashion, or in some combination thereof. [00520] As used in any aspect of the present invention, the term "logic" can refer to an application, software, firmware, and/or circuitry configured to perform any of the aforementioned operations. Software can be embodied as a software package, code, instructions, instruction sets, and/or data recorded on a non-transitory computer-readable storage medium. Firmware can be embodied as code, instructions or instruction sets, and/or data that are hard-coded (for example, non-volatile) in memory devices.
[00521] As used in any aspect of the present invention, the terms "component", "system", "module", and the like can refer to a computer-related entity, either hardware, a combination of hardware and software, software, or software in execution. [00522] As used in any aspect of the present invention, an "algorithm" refers to a self-consistent sequence of steps leading to a desired result, where a "step" refers to a manipulation of physical quantities and/or logic states that can, though need not necessarily, take the form of electrical or magnetic signals capable of being stored, transferred, combined, compared, and otherwise manipulated. It is common usage to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like. These and similar terms can be associated with the appropriate physical quantities and are merely convenient labels applied to those quantities and/or states. [00523] A network can include a packet-switched network. The communication devices can be capable of communicating with each other using a selected packet-switched network communications protocol. One exemplary communications protocol can include an Ethernet communications protocol capable of permitting communication using a Transmission Control Protocol/Internet Protocol (TCP/IP). The Ethernet protocol can comply with or be compatible with the Ethernet standard published by the Institute of Electrical and Electronics Engineers (IEEE), entitled "IEEE 802.3 Standard", published in December 2008, and/or later versions of this standard. Alternatively or in addition, the communication devices can be capable of communicating with each other using an X.25 communications protocol. The X.25 communications protocol can comply with or be compatible with a standard promulgated by the International Telecommunication Union-Telecommunication Standardization Sector (ITU-T).
Alternatively or in addition, the communication devices can be capable of communicating with each other using a frame relay communications protocol. The frame relay communications protocol can comply with or be compatible with a standard promulgated by the Consultative Committee for International Telegraph and Telephone (CCITT) and/or the American National Standards Institute (ANSI). Alternatively or in addition, the transceivers can be capable of communicating with each other using an Asynchronous Transfer Mode (ATM) communications protocol. The ATM communications protocol can comply with or be compatible with an ATM standard published by the ATM Forum, entitled "ATM-MPLS Network Interworking 2.0", published in August 2001, and/or later versions of that standard. Of course, different and/or after-developed connection-oriented network communication protocols are equally contemplated in the present invention. [00524] Unless specifically stated otherwise, as is apparent from the preceding description, it is understood that, throughout the preceding description, discussions using terms such as "processing" or "computing" or "calculating" or "determining" or "displaying" or the like refer to the action and processes of a computer system, or similar electronic computing device, that manipulates and transforms data represented as physical (electronic) quantities within the computer's registers and memories into other data similarly represented as physical quantities within the computer's memories or registers, or other such information storage, transmission, or display devices. [00525] One or more components of the present invention may be referred to as "configured to", "configurable to", "operable/operative to", "adapted/adaptable to", "able to", "conformable/conformed to", etc.
Those skilled in the art will recognize that "configured to" can, in general, encompass active-state components, inactive-state components, and/or standby-state components, unless the context requires otherwise. [00526] The terms "proximal" and "distal" are used in the present invention with reference to a clinician manipulating the handle portion of the surgical instrument. The term "proximal" refers to the portion closest to the clinician, and the term "distal" refers to the portion located away from the clinician. It will further be understood that, for convenience and clarity, spatial terms such as "vertical", "horizontal", "up", and "down" may be used in the present invention with respect to the drawings. However, surgical instruments are used in many orientations and positions, and these terms are not intended to be limiting and/or absolute. [00527] Those skilled in the art will recognize that, in general, the terms used herein, and especially in the appended claims (for example, the bodies of the appended claims), are generally intended as "open" terms (for example, the term "including" should be interpreted as "including, but not limited to", the term "having" should be interpreted as "having at least", the term "includes" should be interpreted as "includes, but is not limited to", etc.). It will be further understood by those skilled in the art that, if a specific number of an introduced claim recitation is intended, such an intent will be explicitly recited in the claim, and, in the absence of such recitation, no such intent is present. For example, as an aid to understanding, the following appended claims may contain usage of the introductory phrases "at least one" and "one or more" to introduce claim recitations.
However, the use of such phrases should not be construed to imply that the introduction of a claim recitation by the indefinite articles "a" or "an" limits any particular claim containing such an introduced claim recitation to claims containing only one such recitation, even when the same claim includes the introductory phrases "one or more" or "at least one" and indefinite articles such as "a" or "an" (for example, "a" and/or "an" should typically be interpreted to mean "at least one" or "one or more"); the same holds true for the use of definite articles used to introduce claim recitations. [00528] Furthermore, even if a specific number of an introduced claim recitation is explicitly recited, those skilled in the art will recognize that such recitation should typically be interpreted to mean at least the recited number (for example, the bare recitation of "two recitations", without other modifiers, typically means at least two recitations, or two or more recitations). Furthermore, in those instances where a convention analogous to "at least one of A, B, and C, etc." is used, in general such a construction is intended in the sense in which one skilled in the art would understand the convention (for example, "a system having at least one of A, B, and C" would include, but not be limited to, systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.). In those instances where a convention analogous to "at least one of A, B, or C, etc." is used, in general such a construction is intended in the sense in which one skilled in the art would understand the convention (for example, "a system having at least one of A, B, or C" would include, but not be limited to, systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.).
It will be further understood by those skilled in the art that typically a disjunctive word and/or phrase presenting two or more alternative terms, whether in the description, the claims, or the drawings, should be understood to contemplate the possibilities of including one of the terms, either of the terms, or both terms, unless the context dictates otherwise. For example, the phrase "A or B" will typically be understood to include the possibilities of "A" or "B" or "A and B". [00529] With respect to the appended claims, those skilled in the art will appreciate that the operations recited therein can generally be performed in any order. Furthermore, although various operational flow diagrams are presented in one or more sequences, it should be understood that the various operations can be performed in orders other than those illustrated, or can be performed concurrently. Examples of such alternate orderings can include overlapping, interleaved, interrupted, reordered, incremental, preparatory, supplemental, simultaneous, reverse, or other variant orderings, unless the context dictates otherwise. Furthermore, terms such as "responsive to", "related to", or other adjectival participles are generally not intended to exclude such variants, unless the context dictates otherwise. [00530] It is worth noting that any reference to "one aspect", "an aspect", "one exemplification", or "an exemplification", and the like, means that a particular feature, structure, or characteristic described in connection with the aspect is included in at least one aspect. Thus, appearances of phrases such as "in one aspect", "in an aspect", "in one exemplification", or "in an exemplification" in various places throughout this specification do not necessarily all refer to the same aspect. Furthermore, the particular features, structures, or characteristics can be combined in any suitable manner in one or more aspects.
[00531] Any patent application, patent, non-patent publication, or other disclosure material referred to in this specification and/or listed in any application data sheet is incorporated herein by reference, to the extent that the incorporated materials are not inconsistent herewith. Accordingly, and to the extent necessary, the description as explicitly set forth herein supersedes any conflicting material incorporated herein by reference. Any material, or portion thereof, that is incorporated herein by reference but that conflicts with the definitions, statements, or other disclosure material set forth herein will be incorporated only to the extent that no conflict arises between the incorporated material and the existing disclosure material. [00532] In summary, numerous benefits resulting from use of the concepts described in this document have been set forth. The foregoing description of one or more embodiments has been presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the invention to the precise form disclosed. The one or more embodiments were chosen and described in order to illustrate principles and practical application, thereby enabling those skilled in the art to utilize the various embodiments, with various modifications, as are suited to the particular use contemplated. Modifications or variations are possible in light of the above teachings. It is intended that the appended claims define the overall scope.
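Paragraph [00523] above describes communication devices exchanging data over a packet-switched network using TCP/IP. A minimal, self-contained sketch of such an exchange over the local loopback interface is shown below; the host address, the use of an OS-assigned port, and the echo behavior are illustrative assumptions, not part of the disclosed system:

```python
import socket
import threading

def run_exchange(message: bytes) -> bytes:
    """Send a message over a loopback TCP/IP connection and return the echo."""
    # "Server" device: listen on an OS-assigned loopback port.
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.bind(("127.0.0.1", 0))  # port 0: let the OS choose a free port
    server.listen(1)
    host, port = server.getsockname()

    def serve():
        conn, _ = server.accept()
        with conn:
            data = conn.recv(1024)  # receive the payload
            conn.sendall(data)      # echo it back to the sender

    t = threading.Thread(target=serve)
    t.start()

    # "Client" device: open a TCP/IP connection and exchange the message.
    with socket.create_connection((host, port)) as client:
        client.sendall(message)
        echoed = client.recv(1024)

    t.join()
    server.close()
    return echoed

echo = run_exchange(b"imaging-data")
```

This only illustrates the transport mechanism named in paragraph [00523]; any of the other protocols mentioned there (X.25, frame relay, ATM) could serve the same role.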
Claims (19) [1] 1. Minimally invasive image capture system, characterized by comprising: a plurality of light sources, with each light source being configured to emit light that has a specified central wavelength; a first light sensing element that has a first field of view and is configured to receive reflected illumination from a first portion of a surgical site when the first portion of the surgical site is illuminated by at least one of the plurality of light sources; a second light sensing element that has a second field of view and is configured to receive reflected illumination from a second portion of the surgical site when the second portion of the surgical site is illuminated by at least one of the plurality of light sources, with the second field of view overlapping at least a portion of the first field of view; and a computing system, the computing system being configured to: receive data from the first light detection element, receive data from the second light detection element, compute imaging data based on the data received from the first light detection element and the data received from the second light detection element, and transmit the imaging data for reception by a display system. [2] 2. Minimally invasive image capture system, according to claim 1, characterized in that the first field of view has a first angle and the second field of view has a second angle, and in that the first angle is equal to the second angle. [3] 3. Minimally invasive image capture system, according to claim 1, characterized in that the first field of view has a first angle and the second field of view has a second angle, and in that the first angle is different from the second angle. [4] 4. Minimally invasive image capture system, according to claim 1, characterized in that the first light detection element has an optical component configured to adjust the first field of view. [5] 5.
Minimally invasive image capture system, according to claim 1, characterized in that the second light detection element has an optical component configured to adjust the second field of view. [6] 6. Minimally invasive image capture system, according to claim 1, characterized in that the second field of view overlaps the entire first field of view. [7] 7. Minimally invasive image capture system, according to claim 1, characterized in that the first field of view is completely surrounded by the second field of view. [8] 8. Minimally invasive image capture system, according to claim 1, characterized in that the first light sensing element and the second light sensing element are at least partially arranged within an elongated camera probe. [9] 9. Minimally invasive image capture system, according to claim 1, characterized in that each of the plurality of light sources is configured to emit light that has a specified central wavelength within the visible spectrum. [10] 10. Minimally invasive image capture system, according to claim 1, characterized in that at least one of the plurality of light sources is configured to emit light that has a specified central wavelength outside the visible spectrum. [11] 11. Minimally invasive image capture system, according to claim 10, characterized in that the specified central wavelength outside the visible spectrum is within an ultraviolet range. [12] 12. Minimally invasive image capture system, according to claim 10, characterized in that the specified central wavelength outside the visible spectrum is within an infrared range. [13] 13.
Minimally invasive image capture system, according to claim 1, characterized in that the computing system configured to compute the imaging data based on the data received from the first light detection element and the data received from the second light detection element comprises a computing system configured to perform a first data analysis on the data received from the first light detection element and a second data analysis on the data received from the second light detection element. [14] 14. Minimally invasive image capture system, according to claim 13, characterized in that the first data analysis differs from the second data analysis. [15] 15. Minimally invasive image capture system, characterized by comprising: a processor; and a memory coupled to the processor, the memory storing instructions executable by the processor to: control the operation of a plurality of light sources illuminating a tissue sample, each light source being configured to emit light that has a specified central wavelength; receive, from a first light detecting element, first data related to reflected illumination from a first portion of a surgical site when the first portion of the surgical site is illuminated by at least one of the plurality of light sources; receive, from a second light detecting element, second data related to reflected illumination from a second portion of the surgical site when the second portion of the surgical site is illuminated by at least one of the plurality of light sources, wherein the second field of view overlaps at least a portion of the first field of view; compute imaging data based on the first data received from the first light detecting element and the second data received from the second light detecting element; and transmit the imaging data for reception by a display system. [16] 16.
Minimally invasive image capture system, according to claim 15, characterized in that the memory coupled to the processor further stores instructions executable by the processor to receive, from a surgical instrument, operational data related to a function or a state of the surgical instrument. [17] 17. Minimally invasive image capture system, according to claim 16, characterized in that the memory coupled to the processor further stores instructions executable by the processor to compute the imaging data based on the first data received from the first light detection element, the second data received from the second light detection element, and the operational data related to the function or state of the surgical instrument. [18] 18. Minimally invasive image capture system, characterized by comprising: a control circuit configured to: control the operation of a plurality of light sources illuminating a tissue sample, with each light source being configured to emit light that has a specified central wavelength; receive, from a first light detecting element, first data related to reflected illumination from a first portion of a surgical site when the first portion of the surgical site is illuminated by at least one of the plurality of light sources; receive, from a second light detecting element, second data related to reflected illumination from a second portion of the surgical site when the second portion of the surgical site is illuminated by at least one of the plurality of light sources, wherein the second field of view overlaps at least a portion of the first field of view; compute imaging data based on the first data received from the first light detecting element and the second data received from the second light detecting element; and transmit the imaging data for reception by a display system. [19] 19.
Non-transitory computer-readable media, characterized by storing computer-readable instructions that, when executed, cause a machine to: control the operation of a plurality of light sources illuminating a tissue sample, each light source being configured to emit light that has a specified central wavelength; receive, from a first light detecting element, first data related to reflected illumination from a first portion of a surgical site when the first portion of the surgical site is illuminated by at least one of the plurality of light sources; receive, from a second light detecting element, second data related to reflected illumination from a second portion of the surgical site when the second portion of the surgical site is illuminated by at least one of the plurality of light sources, wherein the second field of view overlaps at least a portion of the first field of view; compute imaging data based on the first data received from the first light detecting element and the second data received from the second light detecting element; and transmit the imaging data for reception by a display system.
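As an informal illustration only (not the claimed method), the computation recited in claims 1, 15, 18, and 19, combining first data and second data from two light sensing elements whose fields of view overlap, might be sketched as follows. The one-dimensional scan representation, the averaging of overlapping samples, and the function name are all hypothetical choices:

```python
def compute_imaging_data(first, second, overlap):
    """Combine two 1-D intensity scans whose fields of view overlap.

    `first` and `second` are lists of pixel intensities from the first and
    second light sensing elements; the last `overlap` samples of `first`
    view the same region as the first `overlap` samples of `second`.
    Overlapping samples are averaged (an illustrative choice only).
    """
    if overlap < 0 or overlap > min(len(first), len(second)):
        raise ValueError("overlap out of range")
    # Samples seen only by the first element.
    left = first[:len(first) - overlap]
    # Samples seen by both elements: average the two readings.
    shared = [
        (a + b) / 2
        for a, b in zip(first[len(first) - overlap:], second[:overlap])
    ]
    # Samples seen only by the second element.
    right = second[overlap:]
    return left + shared + right

combined = compute_imaging_data([10, 20, 30], [40, 50, 60], overlap=1)
print(combined)  # [10, 20, 35.0, 50, 60]
```

The combined list stands in for the "imaging data" transmitted to the display system; a practical system would operate on 2-D sensor arrays and could weight, register, or otherwise analyze the overlap region rather than simply averaging it.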
Patent family:
Publication number | Publication date
US20210212602A1 | 2021-07-15
JP2021509304A | 2021-03-25
US20190200906A1 | 2019-07-04
WO2019130074A1 | 2019-07-04
CN111542251A | 2020-08-14
EP3505041A1 | 2019-07-03
with cartridge compatible closure and firing lockout arrangements| US11090047B2|2018-03-28|2021-08-17|Cilag Gmbh International|Surgical instrument comprising an adaptive control system| US11213294B2|2018-03-28|2022-01-04|Cilag Gmbh International|Surgical instrument comprising co-operating lockout features| US20200015902A1|2018-07-16|2020-01-16|Ethicon Llc|Force sensor through structured light deflection| US11207065B2|2018-08-20|2021-12-28|Cilag Gmbh International|Method for fabricating surgical stapler anvils| US11253256B2|2018-08-20|2022-02-22|Cilag Gmbh International|Articulatable motor powered surgical instruments with dedicated articulation motor arrangements| US11259807B2|2019-02-19|2022-03-01|Cilag Gmbh International|Staple cartridges with cam surfaces configured to engage primary and secondary portions of a lockout of a surgical stapling device| US11147553B2|2019-03-25|2021-10-19|Cilag Gmbh International|Firing drive arrangements for surgical systems| US11147551B2|2019-03-25|2021-10-19|Cilag Gmbh International|Firing drive arrangements for surgical systems| US11172929B2|2019-03-25|2021-11-16|Cilag Gmbh International|Articulation drive arrangements for surgical systems| US11253254B2|2019-04-30|2022-02-22|Cilag Gmbh International|Shaft rotation actuator on a surgical instrument| US11241235B2|2019-06-28|2022-02-08|Cilag Gmbh International|Method of using multiple RFID chips with a surgical assembly| US11259803B2|2019-06-28|2022-03-01|Cilag Gmbh International|Surgical stapling system having an information encryption protocol| US11246678B2|2019-06-28|2022-02-15|Cilag Gmbh International|Surgical stapling system having a frangible RFID tag| US11224497B2|2019-06-28|2022-01-18|Cilag Gmbh International|Surgical systems with multiple RFID tags| US11234698B2|2019-12-19|2022-02-01|Cilag Gmbh International|Stapling system comprising a clamp lockout and a firing lockout| US11219501B2|2019-12-30|2022-01-11|Cilag Gmbh International|Visualization systems using structured 
light|
Legal status:
2021-06-08 | B08F | Application dismissed because of non-payment of annual fees [chapter 8.6 patent gazette] | Free format text: regarding the 3rd annuity
2021-09-14 | B08G | Application fees: restoration [chapter 8.7 patent gazette]
2021-12-07 | B350 | Update of information on the portal [chapter 15.35 patent gazette]
Priority:
Application number | Filing date | Patent title
US201762611339P | true | 2017-12-28 | 2017-12-28
US201762611341P | true | 2017-12-28 | 2017-12-28
US201762611340P | true | 2017-12-28 | 2017-12-28
US62/611,339 | 2017-12-28
US62/611,340 | 2017-12-28
US62/611,341 | 2017-12-28
US201862649291P | true | 2018-03-28 | 2018-03-28
US62/649,291 | 2018-03-28
US15/940,742 | US20190200906A1 | 2017-12-28 | 2018-03-29 | Dual CMOS array imaging
US15/940,742 | 2018-03-29
PCT/IB2018/055698 | WO2019130074A1 | 2017-12-28 | 2018-07-30 | Dual CMOS array imaging
Related patents
Sulfonates, polymers, resist compositions and patterning process
Washing machine
Washing machine
Device for fixture finishing and tension adjusting of membrane
Structure for Equipping Band in a Plane Cathode Ray Tube
Process for preparation of 7 alpha-carboxyl 9, 11-epoxy steroids and intermediates useful therein an